When Sir Francis Bacon coined the aphorism "Knowledge is power," he didn't foresee a 21st century in which technology and data science would turn knowledge into insight automatically and immediately. Today, the phrase "Prediction is power" may be more appropriate.
There's no other way to slice it: competition for digital audiences is brutal. Intolerance for poor performance and disengaging experiences drives customers to competitors' sites more quickly, and more permanently, than at any time in history. Users increasingly demand digital experiences that personalize to their immediate needs and adapt to their current context, rather than treating them as a market or demographic segment.
In recently published research, we found that even as expectations soar, enterprises are personalizing with methods that are too unsophisticated, too opaque, or too convoluted to serve individuals with the complexity and mutability they require. Persona-based segmentation is too simplistic to meet current, much less future, customer expectations. Some solutions provide predictive analytics capabilities but are limited to a few algorithms or to black-box methods (e.g., neural networks) that are not easily adaptable to new data or scenarios. Those that rely heavily on rules have become morasses, with some customers needing to manage and maintain hundreds or thousands of rules to guide digital experiences.
Day one of the first Cognitive Computing Forum in San Jose, hosted by Dataversity, gave a great perspective on the state of cognitive computing: promising, but early. I am here this week with my research director, Leslie Owens, and analyst colleague Diego LoGudice. Gathering research for a series of reports in our cognitive engagement coverage, we were able to debrief tonight on what we heard and the questions these insights raise. Here are some key takeaways:
1) The big data mind shift to explore and accept failure is a heightened principle. Chris Welty, formerly at IBM and a key developer of Watson and its Jeopardy-winning solution, preached restraint. The analytic pursuit of perfect answers delivers no business value. Keep your eye on the prize and move the needle on what matters, even if your batting average is only .300 (30%). The objective is a holistic pursuit of optimization.
2) The algorithms aren't new; platform capabilities and greater access to data now let us put cognitive computing into production. Every speaker, whether academic, vendor, or other expert, agreed that the algorithms are the same ones created decades ago. Hardware and the volume of available data have made neural networks and other machine learning algorithms both practical and more effective.
It looks like the beginning of a new technology hype cycle for artificial intelligence (AI). The media has started flooding the news with product announcements, acquisitions, and investments. The story is how AI is capturing the attention of tech and investor giants such as Google, Microsoft, and IBM. Add to that the release of the movie 'Her,' about a man falling for a virtual assistant reminiscent of Apple's Siri (think they got the idea from The Big Bang Theory, when Raj falls in love with Siri?), and you know geek-dom has begun the journey to mainstream and cool. The buzzwords are great too: cognitive computing, deep learning, AI2.
For those who started their careers in AI and left in disillusionment (Andrew Ng confessed to this, yet jumped back in), as for many data scientists today, the consensus is often that artificial intelligence is just a fancy new marketing term for good old predictive analytics. They point to Apple's Siri, whose ability to listen and respond to requests is adequate but more often frustrating. Or to IBM Watson's win on Jeopardy as data loading and brute-force programming. From their perspective, the real value lies in the pragmatic logic of the predictive analytics we already have.
But, is this fair? No.
First, let’s set aside what you heard about financial puts and takes. Don’t try to decipher the geek speak of what new AI is compared to old AI. Let’s talk about what is on the horizon that will impact your business.
New AI breaks the current rule that machines must be better than humans: smarter, faster analysts, or able to manufacture things better and cheaper.
On January 9, 2014, IBM launched its first new business unit in 19 years to bring Watson, the machine that beat two Jeopardy champions in 2011, to the rest of us. IBM posits that Watson marks the start of a third era in computing: one that began with manual tabulation, progressed to programmable systems, and has now become cognitive. Cognitive computing listens, learns, converses, and makes recommendations based on evidence.
IBM is placing big bets and big money, $1 billion, on transforming computer interaction from tabulation and programming to deep engagement. If they succeed, our interaction with technology will truly be personal through interactions and natural conversations that are suggestive, supportive, and as Terry Jones of Kayak explained, "makes you feel good" about the experience.
There are still hurdles for IBM and for organizations: expense, complexity, information access, coping with ambiguity and context, the supervision of learning, and implications of machine suggestions that remain unrecognized today. To work, the ecosystem has to be open and communal. Investment is needed beyond the platform, in applications and devices, to deliver on Watson's value. IBM's commitment and leadership are in place. The question is whether IBM and its partners can scale Watson beyond a complex custom solution into a truly transformative approach to business and our way of life.
Forrester believes that cognitive computing has the potential to address important problems that are unmet with today’s advanced analytics solutions. Though the road ahead is unmapped, IBM has now elevated its commitment to bring cognitive computing to life through this new business unit and the help of one third of its research organization, an ecosystem of partners, and pioneer companies willing to teach their private Watsons.
Last year, my colleague Srividya Sridharan published The State Of Customer Analytics 2012 (subscription required). Using the results of her annual customer analytics adoption survey, she uncovered key trends of how customer analytics practitioners use and adopt various advanced analytics across the customer life cycle and highlighted challenges and drivers associated with customer analytics.
This year, I have the pleasure of teaming up with Sri on her yearly survey to further explore the adoption of advanced analytics, measurement, and attribution; please read her blog post to learn more. The survey explores the adoption and usage of measurement techniques, including attribution, alongside advanced analytics methodologies. With this expanded survey, we want to understand how you use and apply measurement and analytics in your organization to optimize both cross-channel marketing campaigns and customer programs.
In particular, we’re fielding questions to understand the goals and challenges associated with measurement and analytics, the adoption and application of measurement and advanced analytics methods, the use of several marketing and customer metrics, the customer insights process and workflow, and the organizational aspects that support measurement and analytics. We encourage you to participate in this survey, as this information will help you benchmark your measurement and analytics adoption efforts.
Buy analytics software, hire marketing scientists, and engage analytics consultants. Now wait for the magic of customer analytics to happen. Right?
Wrong. Building a successful customer analytics capability involves careful orchestration of several capabilities and requires customer insights (CI) professionals to answer some key questions about their current state of customer analytics:
What is the level of importance given to customer analytics in your organization?
Have you clearly defined where you will use the output of customer analytics?
How is your analytics team structured and supported?
How do you manage and process your customer data?
Do you have clear line of sight between analytics efforts and business outcomes?
What is the process of sharing insights from analytics projects?
What type of technology do you need to produce, consume, and activate analytics?
The Obama 2012 campaign famously used big data predictive analytics to influence individual voters. It hired more than 50 analytics experts, including data scientists, to predict which voters would be positively persuaded by campaign contact such as a call, door knock, flyer, or TV ad. Uplift modeling (aka persuasion modeling) is one of the hottest forms of predictive analytics, for obvious reasons: most organizations wish to persuade people to do something, such as buy! In this special episode of Forrester TechnoPolitics, Mike interviews Eric Siegel, Ph.D., author of Predictive Analytics, to find out: 1) What exactly is uplift modeling? And 2) How did the Obama 2012 campaign use it to persuade voters? (< 4 minutes)
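To make the idea concrete, here is a minimal sketch of the common "two-model" approach to uplift modeling, in Python with scikit-learn on synthetic data. All names and numbers are illustrative assumptions, not the campaign's actual method: the point is simply that you model response separately for contacted and uncontacted groups, then target the people with the largest predicted difference.

```python
# Two-model uplift (persuasion) modeling sketch -- illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 3))          # voter/customer features
treated = rng.integers(0, 2, n)      # 1 = contacted, 0 = left alone
# Synthetic truth: contact helps only when feature 0 is high
p = 1 / (1 + np.exp(-(0.5 * X[:, 0] * treated - 0.2)))
y = (rng.random(n) < p).astype(int)  # 1 = responded (voted, bought)

# Fit one response model per group
m_t = LogisticRegression().fit(X[treated == 1], y[treated == 1])
m_c = LogisticRegression().fit(X[treated == 0], y[treated == 0])

# Uplift = predicted response if contacted minus if left alone.
# Spend the contact budget on the highest-uplift individuals.
uplift = m_t.predict_proba(X)[:, 1] - m_c.predict_proba(X)[:, 1]
top_targets = np.argsort(uplift)[::-1][:100]
```

The design choice worth noting is that uplift models the *incremental* effect of contact, not raw likelihood to respond: people who would respond anyway (or who react badly to contact) score low, which is exactly why campaigns find it attractive.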
The deluge of customer data shows no signs of abating. The perpetually connected customer leaves data footprints in every interaction with a brand. This presents tremendous opportunities for customer insights professionals and analytics practitioners tasked with analyzing this data, not only to get smarter about customers but also to ensure that the insights are appropriately used at the point of customer interaction.
When we asked customer analytics users about the challenges and drivers of customer analytics adoption, we found that data integration and data quality continue to inhibit broader adoption, even as users want analytics to strengthen their organization's data-driven focus and to drive customer satisfaction and retention.
Forrester's Customer Analytics Playbook guides customer insights professionals, marketing scientists, and customer analytics practitioners through this new reality of customer data. It helps them discover analytics opportunities, plan for greater sophistication, take steps toward building a customer analytics capability, and continually monitor the progress of analytics initiatives. It will include 12 chapters (and an executive overview) covering different aspects of customer analytics.
I just received yet another call from a reporter asking me to comment on yet another BI vendor announcing R integration. All leading BI vendors are embedding/integrating with R these days, so I was not sure what was really new in the announcement. I guess the real question is the level of integration. For example:
Since R is a scripting language, does the BI vendor provide a point-and-click GUI to generate R code?
Can R routines leverage all of the BI metadata (data structures, definitions, etc.) without having to redefine it just for R?
How easily can the output from R calculations (scores, rankings) be embedded in the BI reports and dashboards? Do the new scores just become automagically available for BI reports, or does somebody need to add them to BI data stores and metadata?
Can the BI vendor import/export R models based on PMML?
Is it a general R integration, or are there prebuilt vertical (industry-specific) or domain (finance, HR, supply chain, risk, etc.) metrics as part of a solution?
Where are R models executed? On the reporting server? The database server? Their own server?
Then there's the whole business of model design, management, and execution, which is usually the realm of advanced analytics platforms. How many of these capabilities does the BI vendor provide?
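As a hypothetical illustration of the score-handoff question above (assuming pandas; the table and column names are invented): when the integration is shallow, scores computed by an external R routine do not "automagically" appear in reports — someone has to join them back into the reporting data and register the new column in the BI metadata.

```python
# Illustrative sketch of a manual score handoff -- names are invented.
import pandas as pd

# Reporting table already defined in the BI layer
customers = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "region": ["East", "West", "East"],
})

# Scores produced by the analytics engine (e.g., written out by an R job)
scores = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "churn_score": [0.12, 0.87, 0.45],
})

# Without deeper integration, this join (and registering churn_score in
# the BI metadata) is a manual step before any report can use the scores.
report_data = customers.merge(scores, on="customer_id", how="left")
print(report_data)
```

A deep BI/R integration would make this step invisible; a shallow one leaves it as recurring plumbing work, which is exactly the distinction the questions above probe.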
Did I get that right? Are there other features/capabilities that really distinguish one BI/R integration from another? I'm really interested in hearing your comments.