Why Surveys Aren't The Best Tool For Designing Experiences

While most design researchers and practitioners would agree that surveys aren't the best tool for designing experiences, I'm still surprised that we get pushback on the value of other (primarily qualitative) research methods from customer experience professionals and, of course, their business colleagues. While many of these people will argue to the grave that surveys are "better" than qualitative research methods because they mitigate risk by being both quantifiable and statistically significant, they don't realize that when designing experiences, surveys introduce risk well before a survey is analyzed. How? Well, surveys:

  • Limit responses. Most surveys (whether they're open-ended or offer restricted responses) ask users for their reaction or input to a specific question or situation. If you're asking for something that's relatively black and white, that's a perfect technique. But if you're asking people to explain why they did (or didn't do) something or about the nuances of how they did something, or if you want to see how their context influences their behavior, then surveys are difficult to craft because you essentially have to know the answers before you ask the question. And if you don't know all of the right answers, then you're introducing risk by guessing what they may be.
  • Rely on respondents to know the truth. While most consumers don't lie purposely (or at least not consistently enough to break a good survey), surveys often rely on consumers to tell you what they will or plan to do and whether they like/dislike or want/don't want something. While those are all important things to know, most people aren't fully aware of, don't remember, or can't predict their own behavior in situations they haven't been in before. So when you're trying to design a new interaction, what people tell you they will do isn't always a risk-free proxy for what they will actually do.

The analysis that goes into surveys can be impressive, and their responses can hold interesting trends and insights for customer experience professionals and their colleagues. But don't get so enamored with the analysis that you miss the fact that the technique had its own drawbacks to begin with. Qualitative methods might not come with fancy charts and statistical analysis (although many social scientists might disagree), but when done right, they offer the most promising way we've seen to overcome the limitations that survey-based techniques impose when you're trying to design an experience.

If you're interested in learning more about some of these techniques or learning how they're used in other enterprises, join me for the Understand Your Customers track at Forrester's Customer Experience Forum. Speakers from Wells Fargo, Frog Design, and the Design and Usability Center at Bentley University will be sharing their real-world experiences.



Human experience and the bias of quantitative analysis

Thank you for your tastefully penned response to the recent criticism of the methodologies and practices of companies such as Forrester and others; I cannot agree more with your blog post. Heuristic and qualitative analysis provide beneficial, multi-faceted customer experience data across the multitude of communication platforms, UIs, contact centers, out-of-box experiences, and so on. Human interactions are nuanced and complex.

Surveys immediately bias the data away from experience and toward influencing opinion. Although in the past surveys provided statistics and charts for PowerPoint decks and business plans, today surveys for user experience provide only limited insight into how customers "feel." If you peek behind a survey-based customer experience analysis, oftentimes statistical bias blemishes the best of intentions.

One example I recall illustrates how bias can creep in even before the customer begins to check the boxes: a survey was sent to prepaid (non-contract) mobile users via email, but most of the demographic served by the companies compared in the study didn't use email. The results were so skewed that price alone appeared to be the only experiential factor that mattered. That hurts consumers, markets, companies, and so on. It's an extreme example, I admit. But I am biased.

I believe there's a better way than established advisors and trusted voices taking public pot shots at competitive organizations. Cooperation and best practices should prevail across company lines rather than obvious stabs at trusted sources such as Forrester and Gartner.

Vidya, I agree with both of the points you make, and might venture a third risk of quant surveys: they take the first answer as the truth. I have lost count of the number of times I have seen a respondent clearly frustrated or puzzled by the task they are attempting, only to report afterwards that they thought it was a good process. This is because users often blame themselves for the problems they are having rather than the poor design of the process: "I must have missed something," or "I think I got it wrong." Only with the skilled prompting of a moderator does the real sense or scale of the problem get identified, and only then can the problem area be resolved.

I look forward to participating in the CX Forum, see you there.

This is very sound advice. The truth is that surveys often produce flawed results. To really decide how best to serve customers, managers and front-line staff need to open their eyes and ears. As this video suggests (http://www.upyourservice.com/video-theater/how-can-you-make-the-best-of-...), there is always room for improvement.