Good news for those of you requesting extensions: We heard you, and we're extending the deadline for Forrester's Voice Of The Customer Award submissions to Friday, April 6th at 5:00 p.m. ET.
While I have you, here are answers to some of the questions I've been getting about the awards:
I'm a vendor. Can I still apply? Yes — but only if your submission is about your own VoC program. We don't accept submissions from vendors on behalf of their clients.
Does my company have to be headquartered in North America? No! This year we've gone global! We'll accept any submission, as long as it's written in English.
Will you honor confidentiality? Yes! No matter what, we'll publish the names of the 10 finalists and three winners. But beyond that, we'll fact-check any specifics with you before we publish them.
Do I have to be a Forrester client? No! We'd love to hear from you whether you're a client or not.
Does the cover page count toward the page limit? No, we're only asking you to limit the content of the submission to seven pages.
Can I get an extension? You already did! And no, we won't be offering any extensions beyond Friday, April 6th.
In the industries we modeled, the revenue benefits of a better customer experience range from $31 million for retailers to around $1.3 billion for hotels and wireless service providers.
What’s behind these impressive numbers? It’s pretty simple, really.
Companies with better customer experience tend to have more loyal customers. We’ve shown through both mathematical correlations and actual company scores that when your customers like the experience you deliver, they’re more likely to consider you for another purchase and recommend you to others. They’re also less likely to switch their business away to a competitor. These improved loyalty scores translate into more actual repeat purchases, more prospects influenced to buy through positive word of mouth, and less revenue lost to churn.
We model the size of the potential benefit using data from real companies. In each industry, we create an archetypal “ACME Company” that scores below industry average in the Customer Experience Index (CXi). We then look at what would happen to ACME’s loyalty scores if it went from below average to above average in the CXi, based on the actual scores of companies in its industry.
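As a rough illustration of the logic above, here is a minimal Python sketch of how loyalty gains could roll up into a revenue estimate. The function name and every number are hypothetical, invented for illustration; this is not Forrester's actual model or its figures.

```python
# Hypothetical sketch: translate three loyalty effects of a CXi improvement
# (more repeat purchases, positive word of mouth, less churn) into an
# incremental annual revenue estimate. All parameter values are invented.

def revenue_impact(customers, annual_revenue_per_customer,
                   repeat_lift, word_of_mouth_lift, churn_reduction):
    """Estimate incremental annual revenue from three loyalty effects.

    Each lift/reduction argument is the assumed change in the relevant
    rate (as a fraction of the customer base) when a company moves from
    below to above its industry's average CXi score.
    """
    repeat_revenue = customers * repeat_lift * annual_revenue_per_customer
    referral_revenue = customers * word_of_mouth_lift * annual_revenue_per_customer
    retained_revenue = customers * churn_reduction * annual_revenue_per_customer
    return repeat_revenue + referral_revenue + retained_revenue

# Invented illustration: 10M customers at $500/year, +1% repeat purchases,
# +0.5% new customers via word of mouth, 0.8% less revenue lost to churn.
print(revenue_impact(10_000_000, 500, 0.01, 0.005, 0.008))  # 115000000.0
```

The point of the sketch is only that each loyalty effect scales linearly with the customer base and per-customer revenue, which is why the modeled benefit is so much larger for high-revenue industries like hotels and wireless.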
Once again, I'm going to write an overview of the European interactive design agency market to help Forrester clients identify design agencies for their projects in Europe. The report title will be "2012: Where To Get Help For Interactive Design Projects In Europe." Participants will receive a copy of the research, and their details will be included in the report.
I would like to invite interactive design agencies in Europe to participate. Please complete the agency survey at the following location:
The survey is designed to gather data from European firms that have significant experience in designing and developing digital experiences (web, mobile, etc.). Survey questions cover interactive agency size, practice areas, industry expertise, locations, and a range of costs for typical engagements. If you know any agencies that should be included in my report, please forward the survey link to them or show them this blog post.
P.S. If you want a preview of the survey, you can see all the questions on the following site:
In the two months since I published "The Customer Experience Index, 2012," a steady stream of companies has requested a deeper look at the data. Many have asked me to suggest ways to use the available information, so I thought I'd share the analyses I've found most interesting so far:
Compare Customer Experience Index (CXi) respondents to your company’s target customer profile. As part of the CXi survey, we collect a range of demographic data including age, gender, marital status, household income, employment status, parental status, and location. Clients find it helpful to see if differences between our scores and their internal data stem from the fact that we’re surveying different populations. They’re also using it to think through why scores on a given criterion are what they are — for example, if most respondents for a TV service provider have small kids, the firm’s parental controls may have a bigger impact on the “meets needs” score than they would if most respondents had grown children.
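That first comparison can be sketched in a few lines of Python. Everything here is hypothetical: the age bands, the shares, and the five-point flagging threshold are all invented for illustration, not drawn from CXi data.

```python
# Hypothetical sketch: flag demographic bands where the CXi survey
# population differs noticeably from the customers a firm targets.
# All category names and percentages below are invented.

cxi_respondents = {"18-34": 0.22, "35-54": 0.48, "55+": 0.30}
target_profile  = {"18-34": 0.40, "35-54": 0.40, "55+": 0.20}

# Report any age band whose survey share differs from the target
# profile by five percentage points or more (arbitrary threshold).
for band in cxi_respondents:
    gap = cxi_respondents[band] - target_profile[band]
    if abs(gap) >= 0.05:
        print(f"{band}: survey share differs from target by {gap:+.0%}")
```

A gap flagged this way doesn't invalidate the scores; it simply signals that a difference between CXi results and internal data may reflect who was surveyed rather than how the experience changed.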
After more than 12 years of evaluating website user experience, Forrester reached a major milestone — completing 1,500 Website User Experience Reviews. That's more than 100 reviews per year or more than 10 per month. Whew! We've been busy.
These reviews (using an expert/scenario/heuristic review methodology) span B2C and B2B sites, intranets, and employee portals across many industries and countries. What we do: We identify target users, attempt to accomplish realistic goals on behalf of those users, and then evaluate the experience against a set of 25 criteria, each graded -2 (severe failure), -1 (fail), +1 (pass), or +2 (best practice).
Many poor experiences. Since scores for each of the 25 criteria range from -2 to +2, total scores could range from -50 to +50, and passing all tests would result in a grade of +25 or higher. But the average score across all of our reviews was only +1.1, and only 3% of the sites earned a passing score (that's a total of 45 sites out of the 1,500. Yes, you read that right: 45).
Fluctuations in scores over time. The average score rises and falls when we look across versions of the methodology and over time. But, finally, in the latest version, there was a significant increase in the average score over the years just prior — a trend we hope to see continue. There's a similar pattern when we compare B2C and B2B sites. B2B sites have consistently lagged behind B2C sites in user experience scores, but we're finally seeing that gap narrow.
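The scoring arithmetic described above can be verified with a short Python sketch. The scale (25 criteria, each scored -2, -1, +1, or +2) comes from the methodology in this post; the function name is ours, invented for illustration.

```python
# Sketch of the Website User Experience Review scoring scale:
# 25 criteria, each graded -2 (severe failure), -1 (fail),
# +1 (pass), or +2 (best practice).

VALID_SCORES = {-2, -1, 1, 2}

def total_score(criteria_scores):
    """Sum 25 per-criterion grades into a site total (-50 to +50)."""
    assert len(criteria_scores) == 25, "methodology uses exactly 25 criteria"
    assert all(s in VALID_SCORES for s in criteria_scores)
    return sum(criteria_scores)

# Passing every criterion at the minimum passing grade (+1) yields +25,
# which is why +25 is the threshold for a passing site overall.
print(total_score([1] * 25))   # 25
print(total_score([2] * 25))   # 50  (best practice everywhere)
print(total_score([-2] * 25))  # -50 (worst possible)
```

Against that scale, the reported average of +1.1 means the typical reviewed site failed roughly as many criteria as it passed.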
In our continuing research on the emerging role of the chief customer officer (CCO), we recently looked at the kinds of authority their firms vest in them to drive change across the organization. This authority can affect the activities they do, the composition of the teams that report into them, and the budgets they control. For firms considering putting this kind of senior customer experience leader in place, Forrester has identified three archetypal models that characterize the most typical modes in which CCOs operate.
Advisory CCOs Play A Coaching Role
Companies that are early in their customer experience transformations are often reluctant to commit too many resources or cede control of core company processes to a CCO. These firms tend to place CCOs in an advisory or coaching role for peers with operational responsibilities, particularly if the company has had past success with centralized teams to drive change management efforts. CCOs running these teams have little control over decision-making and execution and instead derive authority through their expertise and personal reputation within their companies. A mandate from senior leadership in a business unit, the executive management team, or the CEO bolsters these CCOs' ability to change behaviors in other departments. These CCOs:
Build core capabilities and spread awareness. Because they don't directly control operations, advisory CCOs and their teams focus on building core foundational customer experience capabilities and standards as would a center of excellence.
The Holy Grail of customer experience for many firms goes beyond useful and easy, to interactions that create an emotional connection with the customer. That’s not easy to do, but step 1 is creating an experience that is at least enjoyable. Now, before you object . . . I’m not talking Disney-level enjoyable here — just generally pleasant and maybe even a little fun. Two brands that proved it’s possible with high scores on the CXi’s “enjoyable” criterion are: