One of the developing trends in computing, relevant to enterprises and service providers alike, is the notion of workload-specific, or application-centric, computing architectures. These architectures, optimized for specific workloads, promise improved efficiency in running their targeted workloads and, by extension, the services those workloads support. Earlier this year we covered the basics of this concept in “Optimize Scalable Workload-Specific Infrastructure for Customer Experiences.” This week HP announced a pair of server cartridges for its Moonshot system that exemplify the concept, and that represent the next wave of ARM products that will emerge through the remainder of 2014 and into 2015 to tilt once more at the x86 windmill that currently dominates the computing landscape.
Specifically, HP has announced the ProLiant m400 Server Cartridge (m400) and the ProLiant m800 Server Cartridge (m800), both ARM-based servers packaged as cartridges for the HP Moonshot system, which can hold up to 45 of these cartridges in its approximately 4U enclosure. These servers are interesting from two perspectives: both are ARM-based products, one being the first tier-1 vendor offering of a 64-bit ARM CPU, and both are being introduced with a specific workload target in mind, for which they have been specifically optimized.
This Forum will help you identify brand-new software opportunities and run with them. It will cover the must-have competencies that empower application development and delivery leaders to execute on their companies' engagement strategies: accelerating development processes, creating digital experiences, reaching mobile customers, and exploiting analytics and big data. Forrester analysts will deliver forward-thinking content, while industry specialists from companies such as McDonald’s, Mastercard, and GE Capital will provide insight into real, revolutionary new business approaches that are relevant to you right now.
Recent news of a computer program that passed the Turing Test marks a great achievement for artificial intelligence (AI). Pulling down the barrier between human and machine has been a decades-long holy grail pursuit. Right now, it is a novelty. In the near future, the implications are immense.
Which brings us to why you should care.
Earlier this week the House majority leader, Eric Cantor, suffered an enormous defeat in Virginia's Republican primary at the hands of Tea Party candidate David Brat. No one predicted this: the polls were wrong, by a long shot. Frank Luntz, a Republican pollster and communications advisor, offered his opinion in a New York Times op-ed on what was missing: face-to-face discussions and interviews with voters. He asserts that while data collection was limited to discrete survey questions, what it lacked was context. Information such as voter mood, perceptions, motives, and overall mindset was missing. Even if you collect quantitative data across a variety of sources, you don't get to these prescient indicators.
The new wave of AI (the next two to five years) makes capturing this insight possible, and at scale. Marketing organizations are already using such capabilities to test advertising messages and positioning in focus group settings. Take this a step further and allow pollsters to ingest full discussions, whether in person or through transcripts of research interviews, street polls, social media, news discussions and interviews, and other sources where citizens' points of view bear directly and indirectly on voting, and that rich content translates into more accurate and insightful information.
To jump on this R feeding frenzy, most leading BI vendors claim that they “integrate with R,” but what does that claim really mean? Our take: not all BI/R integration is created equal. When evaluating BI platforms for R integration, Forrester recommends considering the following integration capabilities:
I’ve been experimenting for the past year or so with several proactive assistant apps to guide my day: they remind me to get on conference calls with clients, offer to text participants if I'm running late to an in-person lunch, and keep me in touch with friends and colleagues. Some of these apps also integrate with Salesforce, Yammer, and BaseCamp for job-specific context and assistance.
Among the most popular apps, Google Now personalizes recommendations and assistance by applying predictive analytics to data stored in email, contacts, calendar, social, docs, and other online services that users opt in to. Other examples include Tipbit, which applies predictive analytics to build a more intelligent inbox, and EasilyDo, which uses the notification system to recommend ways to automate common everyday tasks. Expect Labs is tackling this space from the other end of the spectrum, offering an intelligent-assistance engine that enterprises can plug into to add proactive features to their own apps.
Here’s what we think:
• Vendors will experience burnout and early customer frustration, much as in voice recognition. In the music industry, it’s said that an artist is only as good as her last hit. We saw that analogy apply to voice recognition when users got frustrated with Siri as soon as she failed them once. Expect a similar dynamic with all types of predictive apps.
We attended the recent Glimpse Conference 2013, where members of New York's tech scene came together at Bloomberg headquarters to talk about social discovery, predictive analytics, and customer engagement.
Our key takeaway from the event: small, real-time data coming from very personal apps like email, calendar, social, and other online services will fuel next-level predictive apps and services. Specifically:
• Better insight doesn’t require more data; it needs the right data. Amassing large databases of customer profiles, purchase history, and web browser activity only goes so far, and is costing companies millions, if not billions, of dollars every year. Mikael Berner from EasilyDo sees a new opportunity in better utilizing data scattered across personal email indices, calendars, social networks, and file and content repositories that directly indicates customers’ plans, interests, and motivations.
• Email, calendar, and location data is a goldmine for predictive analytics. Expedia or TripAdvisor can track web activity to recall that a user searched for hotels last November and is likely to travel again this year, but a flight confirmation sitting in email or vacation time logged in a calendar is a much stronger indicator of travel plans.
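As a sketch of the kind of signal extraction this implies, consider pulling a concrete travel intent out of a confirmation email. The email format, field names, and regular expressions below are invented for illustration; a production system would handle many formats and providers.

```python
import re
from datetime import datetime

# Hypothetical confirmation snippet of the kind that might sit in a
# traveler's inbox; the layout and field names are assumptions.
EMAIL = """Subject: Your flight confirmation
Flight: DL 417  New York (JFK) -> Los Angeles (LAX)
Departs: 2014-11-21 08:30"""

def extract_travel_signal(text):
    """Pull a concrete travel intent (route plus date) out of a
    confirmation email -- a far stronger signal than browsing history."""
    route = re.search(r"\((\w{3})\)\s*->\s*.*\((\w{3})\)", text)
    date = re.search(r"Departs:\s*(\d{4}-\d{2}-\d{2})", text)
    if not (route and date):
        return None  # no recognizable confirmation in this message
    return {
        "origin": route.group(1),
        "destination": route.group(2),
        "departs": datetime.strptime(date.group(1), "%Y-%m-%d").date(),
    }
```

Unlike a browsing-history inference ("searched hotels, might travel"), the extracted record states where and when the customer is actually going.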
Last week I had the privilege of participating on the Advisory Board for the Retail Marketing Analytics Program (ReMAP) at the University of Minnesota, Duluth (UMD). Perhaps the best part of these sessions is the opportunity to meet with the students, many of whom will be tomorrow’s marketing scientists.
During a few conversations on this visit, I was asked how to secure an entry-level position that would involve lots of cool predictive analytics. I want to focus on one of the answers I shared: don't tell anyone you're doing predictive analytics. What do I mean? Imagine you're a freshly minted analyst in the following situation:
• Your manager asks you to quickly evaluate who responded to a promotion.
• You have many factors to investigate (because you have lots of data).
• You have very limited time to find a great answer and build a deliverable.
• The required deliverable needs to be simple and free of analytic jargon.
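The scenario above can often be met with nothing fancier than counting. Here is a minimal sketch with made-up promotion data (the records, factors, and values are invented for illustration): a response-rate breakdown by a single factor, readable without any modeling vocabulary.

```python
from collections import defaultdict

# Hypothetical promotion results: (channel, age_band, responded)
RECORDS = [
    ("email", "18-34", True),  ("email", "35-54", False),
    ("mail",  "18-34", False), ("mail",  "55+",   True),
    ("email", "18-34", True),  ("mail",  "35-54", False),
]

def response_rates(records, factor_index):
    """Response rate by one factor: a plain-English breakdown a
    manager can read, with no analytic jargon attached."""
    hits, totals = defaultdict(int), defaultdict(int)
    for rec in records:
        key = rec[factor_index]
        totals[key] += 1
        hits[key] += rec[-1]  # True counts as 1, False as 0
    return {k: hits[k] / totals[k] for k in totals}

print(response_rates(RECORDS, 0))  # rates by channel
```

The deliverable becomes "email out-responded direct mail," not a discussion of model selection, which is exactly the point of not announcing that you are doing predictive analytics.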
SAP today announced plans to acquire KXEN, a provider of predictive analytics technology. The terms of the deal were not disclosed. This is an interesting development for both companies and highlights the focus on the democratization of predictive analytics, especially for marketers. The proposed deal puts the spotlight on two shifts in the analytics landscape:
Expert user to casual user. Our research shows that finding top analytics talent is a key inhibitor to greater customer analytics adoption. As a result, users expect analytical tools to cater to nontechnical, nonstatistician business and marketing users.
Developers And Their Business Counterparts Are Caught In A Trap
They swim in game-changing new technologies that can access more than a billion hyperconnected customers, but they struggle to design and develop applications that delight customers and dazzle shareholders with annuity-like streams of revenue. The challenge isn’t application development; app developers can ingest and use new technologies as fast as they come. The challenge is that developers are stuck in a design paradigm that reduces app design to making functionality and content decisions based on a few defined customer personas or segments.
Personas Are Sorely Insufficient
How could there be anything wrong with this conventional design paradigm? Functionality? Check. Content? Check. Customer personas? Ah, herein lies the problem. These aggregate representations of your customers can prove valuable when designing apps and are supposedly the state of the art when it comes to customer experience and app design, but personas are blind to the needs of the individual user. Personas were fine in 1999, and maybe even in 2009, but no longer, because we live in a world of 7 billion “me”s. Customers increasingly expect, and deserve, to have a personal relationship with the hundreds of brands in their lives. Companies that ratchet up individual experience will succeed. Those that don’t will increasingly become strangers to their customers.
Buy analytics software, hire marketing scientists, and engage analytics consultants. Now wait for the magic of customer analytics to happen. Right?
Wrong. Building a successful customer analytics capability involves careful orchestration of several capabilities and requires customer insights (CI) professionals to answer some key questions about their current state of customer analytics:
What is the level of importance given to customer analytics in your organization?
Have you clearly defined where you will use the output of customer analytics?
How is your analytics team structured and supported?
How do you manage and process your customer data?
Do you have clear line of sight between analytics efforts and business outcomes?
What is the process of sharing insights from analytics projects?
What type of technology do you need to produce, consume, and activate analytics?