Asking customers for feedback is one of the most direct ways to understand their experiences and needs across touchpoints. However, we’ve all experienced an organization’s attempt to execute this . . . usually poorly.
Surveys are too long. Callbacks are interruptive. What are they going to do with my feedback anyway?
Combating these types of complaints is core to recent conversations with organizations that are establishing voice of the customer (VoC) programs. Common questions include: How do you ensure you are engaging with customers at the right time in the right channels? What main metric should you ask about to ensure consistent data collection? And what is the best way to phrase the question to encourage participation?
Recently I used Forrester's internal collaboration platform — Chatter — to collect stories about when colleagues were asked for feedback. I received a litany of the good, the bad, and the ugly of customer feedback designs. Below are the main takeaways from my internal and external conversations along with examples to consider as you think about the best way to collect information from your customers.
1. Make It Easy
Uber and Peach, a local food delivery service, make it easy to give direct feedback on a specific experience. Both provide visual cues to remind customers what they are giving feedback on and a simple mechanism for providing it (a star rating). There are various ways of executing this, including emojis (think Facebook’s recent updates) and scales (e.g., 1 to 5, 1 to 10). Any of these tactics will work, as long as they align with your brand and are used consistently across touchpoints. Both examples also give customers an opportunity to add more feedback afterward to provide context for the rating.
One week ago today, we Bostonians enjoyed a picture-perfect opening day at Fenway Park. The sun was shining, temps finally warmed up after an abysmal winter, opening ceremonies paid tribute to local heroes like the Richard and Frates families,* and our beloved Red Sox beat the Washington Nationals 9 to 4.
What I love about opening day at Fenway is the optimism, the sense that anything is possible. A new season means a clean slate; the less-than-stellar 2014 baseball season is all but a distant memory.
The biggest change in our new approach is the way we judge CX excellence. To hit a home run, the 299 brands we studied had to do more than make customers happy. They had to design and deliver a CX that actually helps the business by creating and sustaining customer loyalty.
I’m not alone. Creating a superior and differentiated customer experience is a core strategy for most companies — a pillar of who you want to be. It’s likely in your mission statement, annual report, 10-K, strategy deck, or company culture declaration. In a Forrester survey, “improving the customer experience” was tied with “growing revenue” as the No. 1 business priority over the coming year. Great CX is the big ambition in the sky.
For many, it remains an ambition.
The feedback I get from executives is consistent with my own thinking and Forrester’s body of research in this area. CX can’t be an attitude, tagline, or one-time corporate initiative. It has to be a different way of doing business, a new kind of operating model.
That means addressing the complex areas like people, process, and culture.
At Forrester, I keep returning to the basics to help us take simple but important steps forward. Here are five observations from the frontlines:
Change your perspective. We have a sense of how customers are supposed to traverse different touchpoints and a sense of the experiences we want them to have. But that’s not the starting point. CX is about the customers, on their terms and in their voice. Sounds basic, but that fundamental reorientation requires a surprising level of tenacity and discipline.
Are you looking for a vendor or vendors to support your voice of the customer (VoC) program? Or are you reviewing your current VoC vendor(s)?
Selecting the right vendor or vendors can be hard! Why? The VoC vendor landscape is hard to decipher. There are many vendors, but most are relatively small; they rely on an interconnected network of partners, acquire each other at an impressive rate, and regularly expand into new spaces. And companies often already work with a number of vendors: In my recent webinar about VoC, most of the attendees had three to five vendors that supported their VoC program in some shape or form.
But there are a few beacons to help orient you in your quest:
The VoC vendor market is an ecosystem. Which vendors are the right “lid” for your “VoC program pot” depends entirely on your internal capabilities and the characteristics of your VoC program. We identified customer feedback management (CFM) platforms and VoC specialist vendors. CFM platforms support VoC programs with a robust set of capabilities that include feedback collection, integration of feedback with other data in a centralized data hub, analysis, reporting, and closed-loop action management. VoC specialists offer a subset of CFM platform capabilities. Their areas of expertise range from surveying customers in order to generate measurement data to mining your unstructured feedback with text analytics, monitoring social media data, and consulting to help establish or evolve a VoC program.
Are you trying to take your current customer experience measurement to the next level?
Many of the customer experience professionals we talk to regularly are working on improving their customer experience measurement. You are probably one of them. You might be working on picking the right metrics, on connecting customer experience to business outcomes or to operational variables, on using data to improve the customer experience, or on getting traction for CX measurement in your organization. To conquer any or all of these challenges, you need a solid and well-founded customer experience measurement framework.
Allow us to paint a vision of the future for you: After interactions with your favorite companies, no one asks you how you liked those interactions. Your email inbox contains no requests for a few minutes of your time. No one asks you to wait on the phone line to answer a few questions. The word "survey" has vanished from your vocabulary.
You just bought something at your favorite store. You walk out with a skip in your step thinking about when you might wear this new purchase. You give into your compulsion to check your email on your smartphone, and there, waiting for you, is a survey from that very company asking about your experience. You groan, but you click on the link. The survey isn't formatted for your phone, so you have to pinch to zoom in and out. You don't understand the first question. Or the second one. Frankly, you don't really care. You close your browser window, curse the company and every other company that has ever asked you to complete a survey, and vow never to shop anywhere ever again.
I'm no doctor, but I'm confident in my diagnosis: You are suffering from survey fatigue.
You're not alone. Survey fatigue has even made it into pop culture as a known malady, thanks to articles like this one in USA Today. It's no surprise that consumers are irked; most companies' customer experience measurement programs and voice of the customer programs rely on surveys for the necessary data. As a result, consumers are getting barraged with requests for feedback, and, ironically, it's because companies have good intentions: They want to know how they're doing and how they can improve the experience.
If you're one of these survey-reliant companies, what can you do? I'm working on some research right now on that very topic with our new analyst, Maxie Schmidt-Subramanian. We're exploring indicators of survey fatigue to help you spot the problem as well as best practices for reducing any fatigue that does exist.
Marketing Manager: “Net Promoter Score is the one number we need to grow!”
Customer Intelligence Manager: “Nonsense! ‘Satisfaction’ predicts customer loyalty better than ‘likelihood to recommend’ – it says so in the wonky business journals I read!”
Marketing Manager: “You don’t understand how business works!”
Customer Intelligence Manager: “You don’t understand how math works!”
The sad thing is that in a micro sense they’re both right, but in a macro sense they’re both wrong. The reason? They’re each taking an inside-out point of view based on their own specialties.
Where NPS Fits In A Customer Experience Measurement Framework
In our research into customer experience measurement, we see many organizations that use Net Promoter Score. Some use it poorly because – like the fictional marketing manager above – they don’t understand the limitations of what NPS can do.
Here’s how they should think of it: Customer experience is how customers perceive their interactions with a company along each step of a customer journey, from discovery, to purchase and use, to getting service. NPS measures what customers say they’ll do as a result of one or more of those interactions. It’s what Forrester calls an “outcome metric.”
But outcome metrics are just one of three types of metrics captured by effective customer experience measurement programs. The best programs gather and analyze:
Today we published Forrester’s 2012 Customer Experience Index (CXi). It’s our fifth annual benchmark of customer experience quality as judged by the only people whose opinion matters — customers. The CXi is based on research conducted at the end of 2011 and reflects how consumers perceived their experiences with 160 brands across 13 industries.
For those new to the index, let me explain how it works. The process has three steps:
We ask more than 7,600 consumers to identify companies they do business with in 13 different industries.
We ask them to tell us how well each firm met their needs, how easy the firm was to work with, and how enjoyable it was to work with. We ask these questions at the brand level to get a sense of their overall experience with the company regardless of channel.
For all three questions, we calculate each firm’s CXi score by subtracting the percentage of its customers who reported a bad experience from the percentage who reported a good experience. The overall CXi is an average of those three results.
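The scoring math described above can be sketched in a few lines of code. This is a rough illustration only: the percentages below are invented for the example and are not actual CXi survey data.

```python
# Illustrative sketch of the CXi scoring math described above.
# The percentages are made-up examples, not real survey results.

def question_score(pct_good: float, pct_bad: float) -> float:
    """Net score for one question: % reporting a good experience
    minus % reporting a bad experience."""
    return pct_good - pct_bad

# Hypothetical results for one brand on the three CXi questions:
# (percent good experience, percent bad experience)
meets_needs = question_score(70, 12)        # met their needs
easy_to_work_with = question_score(65, 15)  # easy to work with
enjoyable = question_score(60, 18)          # enjoyable to work with

# The overall CXi is the average of the three question-level scores.
cxi = (meets_needs + easy_to_work_with + enjoyable) / 3
print(cxi)  # prints 50.0
```

Because each question's score is a good-minus-bad percentage, a single question score can range from -100 to 100, and so can the averaged overall CXi.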
For the past five years, I’ve been leading Forrester’s research on measuring customer experience. With the recent explosion of interest in customer experience overall and the perennial popularity of metrics as a topic within that space, we’ve decided to expand the team that covers it.
I’ll continue to write reports about general measurement best practices and how to apply them in an enterprise-level experience measurement program. My colleague Adele Sage is adding to that body of work by exploring how the latest experience measurement theory applies in digital channels like Web, mobile, tablets, and whatever new channel they dream up next. And in fact, she just published her first two reports in this research stream: