It looks like the beginning of a new technology hype cycle for artificial intelligence (AI). The media has started flooding the news with product announcements, acquisitions, and investments. The story is how AI is capturing the attention of tech and investor giants such as Google, Microsoft, and IBM. Add to that the release of the movie ‘Her’, about a man falling for his virtual assistant modeled after Apple’s Siri (perhaps inspired by the Big Bang Theory episode in which Raj falls in love with Siri), and you know geek-dom has begun the journey to mainstream and cool. The buzzwords are great too: cognitive computing, deep learning, AI2.
For those who started their careers in AI and left in disillusionment (Andrew Ng confessed to this, yet jumped back in), and for many data scientists today, the consensus is often that artificial intelligence is just a fancy new marketing term for good old predictive analytics. They point to the reality of Apple’s Siri, whose ability to listen and respond to requests is adequate at best and more often frustrating. Or they dismiss IBM Watson’s win on Jeopardy as data loading and brute-force programming. From their perspective, the real value lies in the pragmatic logic of the predictive analytics we already have.
But is this fair? No.
First, let’s set aside what you heard about financial puts and takes. Don’t try to decipher the geek speak of what new AI is compared to old AI. Let’s talk about what is on the horizon that will impact your business.
New AI breaks the current rule that machines must be better than humans: smarter, faster analysts, or able to manufacture things better and cheaper.
On January 9, 2014, IBM launched its first new business unit in 19 years to bring Watson, the machine that beat two Jeopardy champions in 2011, to the rest of us. IBM posits that Watson marks the start of a third era in computing: one that began with manual tabulation, progressed to programmable systems, and has now become cognitive. Cognitive computing listens, learns, converses, and makes recommendations based on evidence.
IBM is placing big bets and big money, $1 billion, on transforming computer interaction from tabulation and programming to deep engagement. If they succeed, our interaction with technology will truly be personal through interactions and natural conversations that are suggestive, supportive, and as Terry Jones of Kayak explained, "makes you feel good" about the experience.
There are still hurdles for IBM and organizations, such as expense, complexity, information access, coping with ambiguity and context, the supervision of learning, and implications of Watson's suggestions that are unrecognized today. To work, the ecosystem has to be open and communal. Investment is needed beyond the platform, in applications and devices, to deliver on Watson's value. IBM's commitment and leadership are in place. The question is whether IBM and its partners can scale Watson beyond a complex custom solution into a truly transformative approach to business and our way of life.
Forrester believes that cognitive computing has the potential to address important problems that today’s advanced analytics solutions leave unmet. Though the road ahead is unmapped, IBM has now elevated its commitment to bring cognitive computing to life through this new business unit and the help of one third of its research organization, an ecosystem of partners, and pioneer companies willing to teach their private Watsons.
Last year, my colleague Srividya Sridharan published The State Of Customer Analytics 2012 (subscription required). Using the results of her annual customer analytics adoption survey, she uncovered key trends of how customer analytics practitioners use and adopt various advanced analytics across the customer life cycle and highlighted challenges and drivers associated with customer analytics.
This year, I have the pleasure of teaming up with Sri on her yearly survey to further explore the adoption of advanced analytics, measurement, and attribution. Please read her blog post to learn more about the survey. This survey will explore the adoption and usage of measurement techniques, including attribution, and the adoption of advanced analytics methodologies. With this expanded survey, we want to understand how you use and apply measurement and analytics in your organization to optimize both cross-channel marketing campaigns and customer programs.
In particular, we’re fielding questions to understand the goals and challenges associated with measurement and analytics, the adoption and application of measurement and advanced analytics methods, the use of several marketing and customer metrics, the customer insights process and workflow, and the organizational aspects that support measurement and analytics. We encourage you to participate in this survey, as this information will help you benchmark your measurement and analytics adoption efforts.
Buy analytics software, hire marketing scientists, and engage analytics consultants. Now wait for the magic of customer analytics to happen. Right?
Wrong. Building a successful customer analytics capability involves careful orchestration of several capabilities and requires customer insights (CI) professionals to answer some key questions about their current state of customer analytics:
What is the level of importance given to customer analytics in your organization?
Have you clearly defined where you will use the output of customer analytics?
How is your analytics team structured and supported?
How do you manage and process your customer data?
Do you have clear line of sight between analytics efforts and business outcomes?
What is the process of sharing insights from analytics projects?
What type of technology do you need to produce, consume and activate analytics?
Since 2010, whenever Forrester has asked about organizations’ top software priorities, business intelligence (BI) has ranked number one. Continued economic uncertainty and major industry-changing dynamics like mobility and the shift to digital business put a premium on data and information. The ability to effectively extract, analyze, and interpret vast quantities of data has simply become critical to business strategy decisions. Investments in BI and analytics reflect the importance being placed on these technologies.
However, the large number of analytics technologies at differing levels of maturity and adoption has, in many cases, left BI planners confused as to which technology to adopt for which scenario.
As a result, my colleague Holger Kisker and I used Forrester’s TechRadar methodology to examine 15 key analytics technologies and identify their usage scenarios, current maturity within the enterprise, future trajectory, key vendors, and estimated implementation costs. The technologies analyzed included the following: reporting, dashboards, performance analytics, embedded analytics, web analytics, process analytics, predictive analytics, OLAP, advanced visualization, metadata-generated analytics, location analytics, search/discovery, streaming analytics, nonmodeled data exploration and discovery, and finally text analytics. Forrester clients can read the full report here.
The Obama 2012 campaign famously used big data predictive analytics to influence individual voters. It hired more than 50 analytics experts, including data scientists, to predict which voters would be positively persuaded by political campaign contact such as a call, door knock, flyer, or TV ad. Uplift modeling (aka persuasion modeling) is one of the hottest forms of predictive analytics, for obvious reasons: most organizations wish to persuade people to do something, such as buy! In this special episode of Forrester TechnoPolitics, Mike interviews Eric Siegel, Ph.D., author of Predictive Analytics, to find out: 1) What exactly is uplift modeling? and 2) How did the Obama 2012 campaign use it to persuade voters? (< 4 minutes)
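To make the idea concrete: uplift modeling estimates the difference a contact makes, not just the likelihood of the outcome. Here is a minimal sketch of the simplest approach, comparing response rates between treated and control groups per segment. All segment names and records are invented for illustration; a real campaign would use a predictive model over many voter attributes rather than raw segment rates.

```python
from collections import defaultdict

# Toy records: (segment, treated, responded). "Treated" means the person was
# contacted (call, door knock, flyer); "responded" means they took the action.
records = [
    ("young_urban", True, True), ("young_urban", True, True),
    ("young_urban", True, False), ("young_urban", False, False),
    ("young_urban", False, True), ("young_urban", False, False),
    ("rural_senior", True, False), ("rural_senior", True, False),
    ("rural_senior", True, True), ("rural_senior", False, True),
    ("rural_senior", False, True), ("rural_senior", False, False),
]

def uplift_by_segment(records):
    """Estimate uplift = P(respond | treated) - P(respond | control) per segment."""
    counts = defaultdict(lambda: {"t": [0, 0], "c": [0, 0]})  # [responses, total]
    for segment, treated, responded in records:
        arm = "t" if treated else "c"
        counts[segment][arm][0] += int(responded)
        counts[segment][arm][1] += 1
    uplift = {}
    for segment, arms in counts.items():
        p_treated = arms["t"][0] / arms["t"][1]
        p_control = arms["c"][0] / arms["c"][1]
        uplift[segment] = p_treated - p_control
    return uplift

print(uplift_by_segment(records))
```

In this toy data, contact raises response in one segment and lowers it in the other: a negative uplift flags the "sleeping dogs" whom contact actually backfires on, which is exactly why campaigns target contact by predicted uplift rather than by predicted response alone.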
The deluge of customer data shows no signs of abating. The perpetually connected customer leaves data footprints in every interaction with a brand. This presents tremendous opportunities for customer insights professionals and analytics practitioners tasked with analyzing this data: not only to get smarter about customers, but to ensure that the insights are appropriately used at the point of customer interaction.
When we asked customer analytics users about the challenges and drivers of customer analytics adoption, we found that data integration and data quality continue to inhibit broader adoption of customer analytics, while users still want to use analytics to improve the organization's data-driven focus and drive customer satisfaction and retention.
Forrester’s Customer Analytics Playbook guides customer insights professionals, marketing scientists, and customer analytics practitioners through this new reality of customer data, helping them discover analytics opportunities, plan for greater sophistication, take steps toward building a customer analytics capability, and continually monitor the progress of analytics initiatives. It will include 12 chapters (and an executive overview) that cover different aspects of customer analytics.
I just received yet another call from a reporter asking me to comment on yet another BI vendor announcing R integration. All leading BI vendors are embedding/integrating with R these days, so I was not sure what was really new in the announcement. I guess the real question is the level of integration. For example:
Since R is a scripting language, does a BI vendor provide point-and-click GUI to generate R code?
Can R routines leverage and take advantage of all of the BI metadata (data structures, definitions, etc.) without having to redefine it again just for R?
How easily can the output from R calculations (scores, rankings) be embedded in the BI reports and dashboards? Do the new scores just become automagically available for BI reports, or does somebody need to add them to BI data stores and metadata?
Can the BI vendor import/export R models based on PMML?
Is it a general R integration, or are there prebuilt vertical (industry specific) or domain (finance, HR, supply chain, risk, etc) metrics as part of a solution?
Where are R models executed? On the reporting server? The database server? Their own server?
Then there's the whole business of model design, management, and execution, which is usually the realm of advanced analytics platforms. How many of these capabilities does the BI vendor provide?
Did I get that right? Any other features/capabilities that really distinguish one BI/R integration from another? Really interested in hearing your comments.
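To illustrate why the score-embedding question above matters in practice: getting R-computed scores back into BI reports often reduces to a key-based join between the BI data store and the model's output. A hypothetical sketch in Python (table layouts, column names, and values are all invented for illustration):

```python
import csv
import io

# Hypothetical exports: the BI store's customer table, and churn scores an
# R model wrote out, keyed by customer_id. Note the R output covers only
# the customers that were scored.
bi_rows = "customer_id,region,revenue\n1,EMEA,1200\n2,APAC,800\n3,NA,950\n"
r_scores = "customer_id,churn_score\n1,0.82\n3,0.15\n"

def merge_scores(bi_csv, score_csv, key="customer_id"):
    """Left-join R model scores onto BI rows; unscored rows get an empty field."""
    scores = {row[key]: row["churn_score"]
              for row in csv.DictReader(io.StringIO(score_csv))}
    merged = []
    for row in csv.DictReader(io.StringIO(bi_csv)):
        row["churn_score"] = scores.get(row[key], "")
        merged.append(row)
    return merged

for row in merge_scores(bi_rows, r_scores):
    print(row)
```

A tight BI/R integration does this join automatically and registers the new score column in the BI metadata layer; a loose one leaves every step here (export, join, reload, redefine metadata) to the customer, which is exactly the distinction the questions above probe.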
I’m excited to announce that our new research on how firms use customer analytics was just published today. The new research reveals some interesting findings:
Customer analytics serves the customer life cycle, but measurement is restricted to marketing activities. While customer analytics continues to drive acquisition and retention goals, firms continue to measure the success of customer analytics using easy-to-track marketing metrics as opposed to deeper profitability or engagement measures.
Finding the right analytics talent remains challenging. It’s not just the data. It’s not just the technology that hinders analytics success. It’s the analytical skills required to use the data in creative ways, ask the right questions of the data, and use technology as a key enabler to advance sophistication in analytics. We’ve talked about how customer intelligence (CI) professionals need a new breed of marketing scientist to elevate the consumption of customer analytics.
CI professionals are keen to use predictive analytics in customer-focused applications. Forty percent of respondents to our Global Customer Analytics Adoption Survey tell us that they have been using predictive analytics for less than three years, while more than 70% of respondents have been using descriptive analytics and BI-type reporting for more than 10 years. CI professionals have not yet fully leveraged the strengths of predictive analytics in customer applications.
Earlier this week Dell joined arch-competitor HP in endorsing ARM as a potential platform for scale-out workloads by announcing “Copper,” an ARM-based version of its PowerEdge-C dense server product line. Dell’s announcement and positioning, while a little less high-profile than HP’s February announcement, is intended to serve the same purpose — to enable an ARM ecosystem by providing a platform for exploring ARM workloads and to gain a visible presence in the event that it begins to take off.
Dell’s platform is based on a four-core Marvell ARM v7 SoC, which Dell claims delivers somewhat higher performance than the Calxeda part, although it draws more power, at 15W per node (including RAM and local disk). The server uses the PowerEdge-C form factor of 12 vertically mounted server modules in a 3U enclosure, each with four server nodes, for a total of 48 servers/192 cores per 3U enclosure. In a departure from other PowerEdge-C products, the Copper server has integrated L2 network connectivity spanning all servers, so the unit can serve as a low-cost test bed for clustered applications without external switches.
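The density figures quoted above follow directly from the enclosure arithmetic, and the same multiplication gives a rough per-enclosure power envelope from the stated 15W-per-node figure (compute power only, excluding shared enclosure overhead such as fans and power supplies, which Dell has not broken out here):

```python
# Enclosure arithmetic for Dell "Copper" as described: 12 modules per 3U,
# 4 server nodes per module, 4 cores per node, 15W per node.
modules_per_3u = 12
nodes_per_module = 4
cores_per_node = 4
watts_per_node = 15

servers = modules_per_3u * nodes_per_module   # server nodes per 3U enclosure
cores = servers * cores_per_node              # cores per 3U enclosure
node_watts = servers * watts_per_node         # node power per 3U, excl. overhead

print(servers, cores, node_watts)
```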
Dell is offering this server to selected customers, not as a GA product, along with open source versions of the LAMP stack, Crowbar, and Hadoop. Currently, Canonical is supplying Ubuntu for ARM servers, and Dell is actively working with other partners. Dell expects to see OpenStack available for demos in May, and there is an active Fedora project underway as well.