Many of the conversations I have with clients about voice of the customer (VoC) programs center on ways the programs can improve and best practices they can adopt. What I think is really underlying these discussions, though, is the question, "How does my program compare with all the others that are out there?" Or, more succinctly, "How am I doing?"
My anecdotal conversations, though frequent, do not make for a quantitative study. So I ran one: I surveyed our Global Customer Experience Peer Research Panel about their VoC programs. The results will be published shortly in a Forrester report called "The State Of VoC Programs, 2012," but in the meantime, I'd like to give you a sneak peek.
Our most important finding was that customer experience professionals aren't getting the value they could be from their programs. Specifically, we asked how valuable their programs were in improving customers' experiences and how valuable they were in delivering financial results. It turns out that VoC programs do help companies improve the customer experience; more respondents reported getting that kind of value. But firms struggle to connect the dots to financial value.
So why the gap? It turns out that customer experience value is pretty easy to recognize. Respondents told us that the feedback data they collect helps them identify problems with the experience that need to be fixed. It also helps them prioritize what to fix because they can take the input from their customers into account when looking at all the various improvement project opportunities. The resulting projects make the experience better.
We just announced the winners of Forrester’s 2012 Voice Of The Customer (VoC) Awards at our Customer Experience Forum this afternoon. We received roughly 40 nominations, and yet again, we were incredibly impressed with the breadth and depth of the submissions. We broadened our scope, too: For the first time, we accepted nominations from around the world.
To evaluate the submissions, each of our three judges graded each nomination based on five criteria: clarity of approach, impact on customers’ experiences, impact on business performance, degree of innovation, and lessons provided for other firms. The nominees with the 10 best scores were named finalists. The top three scorers were named winners.
Where will you be at 5:20 p.m. on Tuesday, June 26th? I know where I’ll be: announcing the winners of Forrester’s annual Voice Of The Customer Awards at Forrester’s Customer Experience Forum in NYC!
We’ve read through the incredible submissions, graded them, picked our finalists, and picked our winners. And I’ve been busy for the past few weeks organizing everything for the big announcement on June 26th and the Forum track session panel featuring the three winners that’s happening on Wednesday, June 27th, at 1:20 p.m.
Yes, you heard right. This year I’m not just announcing the winners on Day 1; I’m also moderating a track session where you can hear more about the winning programs. First, the companies’ representatives are going to describe the types of improvements they’ve made and the business results associated with those improvements, and then I’m going to open up the floor to Q&A. This is your chance to ask the burning questions about what makes these programs so successful.
Good news for those of you requesting extensions: We heard you, and we're extending the deadline for Forrester's Voice Of The Customer Award submissions to Friday, April 6th at 5:00 p.m. ET.
While I have you, here are answers to some of the questions I've been getting about the awards:
I'm a vendor. Can I still apply? Yes — but only if your submission is about your own VoC program. We don't accept submissions from vendors on behalf of their clients.
Does my company have to be headquartered in North America? No! This year we've gone global! We'll accept any submission, as long as it's written in English.
Will you honor confidentiality? Yes! No matter what, we'll publish the names of the 10 finalists and three winners. But any specifics that we want to publish beyond that, we'll fact-check with you first.
Do I have to be a Forrester client? No! We'd love to hear from you whether you're a client or not.
Does the cover page count toward the page limit? No, we're only asking you to limit the content of the submission to seven pages.
Can I get an extension? You already did! And no, we won't be offering any extensions beyond Friday, April 6th.
After more than 12 years of evaluating website user experience, Forrester reached a major milestone — completing 1,500 Website User Experience Reviews. That's more than 100 reviews per year or more than 10 per month. Whew! We've been busy.
These reviews (using an expert/scenario/heuristic review methodology) span B2C and B2B sites, intranets, and employee portals across many industries and countries. What we do: We identify target users, attempt to accomplish realistic goals as those users would, and then grade the experience against a set of 25 criteria, scoring each criterion -2 (severe failure), -1 (fail), +1 (pass), or +2 (best practice).
Many poor experiences. Since each of the 25 criteria is scored from -2 to +2, total scores can range from -50 to +50, and passing every test would result in a grade of +25 or higher. But the average score across all of our reviews was only +1.1, and only 3% of the sites earned a passing score (that's 45 sites out of 1,500. Yes, you read that right: 45).
Fluctuations in scores over time. The average score rises and falls when we look across versions of the methodology and over time. But, finally, in the latest version, there was a significant increase in the average score over the years just prior — a trend we hope to see continue. There's a similar pattern when we compare B2C and B2B sites. B2B sites have consistently lagged behind B2C sites in user experience scores, but we're finally seeing that gap narrow.
It’s that time of year again. We’re already in the midst of planning our annual Customer Experience Forum, and now we’re gearing up to collect and evaluate nominations for our Voice Of The Customer Awards — which we’ll present at the Forum.
If you’re new to the awards, here’s some background: Forrester's annual Voice Of The Customer Awards recognize organizations that excel in collecting, analyzing, and acting on feedback from their customers, incorporating customer insights into everyday decisions. We conduct the awards for three basic reasons: 1) to emphasize the importance of voice of the customer (VoC) programs; 2) to celebrate organizations that are leading the way; and 3) to highlight best practices.
If you (or, if you’re a vendor, your clients) have a strong VoC program, we encourage you to participate. It's free and it offers a great opportunity to earn some solid PR while sharing your wisdom with other customer experience pros. Also, we only reveal the names of the finalists and winners, so the potential downside is limited.
You can find all of the information you need on our VoC Award home page. The 2012 nomination form will become available there on March 5th. In the meantime, you can review this year's timeline, get answers to FAQs, and check out information about past winners.
In the US alone, Forrester is forecasting nearly 100 million smartphones by the end of 2011. And digital customer experience professionals are meeting the new mobile demand by creating or redesigning mobile experiences: 34 of the 48 customer experience professionals we surveyed at the end of last year said that they’re planning major mobile design projects in 2011.
In the rush to create great mobile experiences, most teams focus only on what happens within the browser or app. But we know that consumers often turn to the call center when they can’t accomplish their goal on the Web. And that transition isn’t always seamless.
Let’s say we have a customer using a mobile banking app to look up the balance on his mortgage. Once he sees how much is left, he wonders what his options are to refinance at a better interest rate. He can get some basic refi rates in the app, but he wants to know whether, as a longtime customer, he can get a better rate. He goes to the "Contact Us" screen in the app and clicks on the phone number.
What happens next? He starts at the top of the IVR, has to identify himself all over again, and then navigates menus to reach an appropriate agent. Talk about a frustrating experience for the customer and a waste of time for the agent, who has to recapture what he was doing!
Remember: A smartphone is also a phone.
Browser and app experiences built for seamless transitions to phone agents should:
Oh, look what came in the mail yesterday: The order I tried desperately to cancel last week. But, no, UPS dropped it off, and the packing slip said nicely, “Thank you for your order! We are committed to ensure [sic] your experience exceeds your expectations.” Well, you failed.
Let me start from the beginning.
You see, I’m working on reviews for the latest “Best And Worst Of Website User Experience” report (check out last year’s report if you’re curious), and this year we’re evaluating the user experience at the top four tablet manufacturers’ sites. Instead of actually ordering brand new tablets, we ordered an inexpensive accessory from each, completed the checkout process, and then immediately canceled the order so that nothing would ship and no cards would be charged. Canceling went fine for three of the orders, but the fourth, from a company that shall remain nameless, proved more difficult.
Here are all the steps I took to try to cancel the order:
I tried chat. I went to the “Help” page on the site and found listed in the contact info section a link to chat and a phone number. I initiated the chat and reached an agent, but the conversation was very slow (about 20 lines of communication in 15 minutes), the rep was hard to understand, and she couldn’t help me. She told me to call 1-800-[company].
I tried the website itself. I could check order status very easily on the site, but the info just told me the status (“In process”) and provided no contact information in context for order questions.
Forrester surveyed US consumers about their satisfaction with Web-to-store and store-to-Web transitions in three retail segments — apparel/accessories/footwear, consumer electronics, and wireless phones and service.
The results: Satisfaction with both Web-to-store and store-to-Web shopping is low.
Consumer electronics: 66% satisfied with Web-to-store shopping, and 55% satisfied with store-to-Web shopping.
Apparel/footwear/accessories: 60% satisfied with Web-to-store shopping, and 53% satisfied with store-to-Web shopping.
Wireless products and services: 54% satisfied with Web-to-store shopping, and 48% satisfied with store-to-Web shopping.
A few months ago, I asked for your input on our Web Site Review methodology. Harley Manning, Rich Gans, and I incorporated your feedback, scoured the latest academic and human factors research, and reflected on the 1,300+ reviews we've completed. And the result? The latest and greatest version, officially renamed Forrester's Web Site User Experience Review 8.0.
What is it? Forrester's Web Site User Experience Review uncovers flaws that prevent users from accomplishing key goals on Web sites. It's an expert evaluation, a type of methodology (also known as a heuristic evaluation or scenario review) that was originally developed by Rolf Molich and Jakob Nielsen as a lower-cost alternative to lab-based usability techniques.
How does it work? The review process begins by identifying the target users and their goals on the particular site. Armed with this information, a trained reviewer emulates the user and tries to accomplish specific goals on the site. The experience is then graded against 25 criteria. Scores for each criterion range from -2 (severe failure) to +2 (best practice), so overall scores for completed Web Site User Experience Reviews range from -50 to +50, with +25 representing a passing score.
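For readers who like to see the arithmetic spelled out, here's a minimal sketch of how the scoring math works. This is purely illustrative (the function name and structure are my own, not Forrester's actual tooling); it just encodes the rules above: 25 criteria, four possible grades with no zero, totals from -50 to +50, and +25 as the passing bar.

```python
# Hypothetical sketch of the Web Site User Experience Review scoring arithmetic.
# Each of the 25 criteria gets one of four grades; there is no 0 ("neutral") grade.
VALID_GRADES = {-2, -1, 1, 2}   # severe failure, fail, pass, best practice
NUM_CRITERIA = 25
PASSING_SCORE = 25              # equivalent to a +1 ("pass") on every criterion

def review_score(grades):
    """Sum the 25 criterion grades and report whether the site passes."""
    if len(grades) != NUM_CRITERIA:
        raise ValueError(f"expected {NUM_CRITERIA} grades, got {len(grades)}")
    if any(g not in VALID_GRADES for g in grades):
        raise ValueError("each grade must be -2, -1, +1, or +2")
    total = sum(grades)  # ranges from -50 to +50
    return total, total >= PASSING_SCORE

# A site that passes 20 criteria, earns best practice on 3, and fails 2:
total, passed = review_score([1] * 20 + [2] * 3 + [-1] * 2)
# total = 20 + 6 - 2 = 24, just short of the +25 passing bar
```

Note how unforgiving the bar is: even a handful of -1 grades must be offset by +2 best practices elsewhere, which is consistent with how few sites earn a passing score.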