Lessons Learned From 1,500 Website User Experience Reviews

After more than 12 years of evaluating website user experience, Forrester reached a major milestone — completing 1,500 Website User Experience Reviews. That's more than 100 reviews per year or more than 10 per month. Whew! We've been busy.

These reviews (using an expert/scenario/heuristic review methodology) span B2C and B2B sites, intranets, and employee portals across many industries and countries. What we do: We identify target users, attempt to accomplish realistic goals on their behalf, and then evaluate the experience against a set of 25 criteria, grading each criterion as -2 (severe failure), -1 (fail), +1 (pass), or +2 (best practice).
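
To make the scoring arithmetic concrete, here's a minimal sketch in Python, assuming the review is captured as a simple list of 25 criterion scores. The names (score_review, VALID_SCORES, NUM_CRITERIA) are hypothetical illustrations, not part of Forrester's actual tooling:

```python
# Illustrative sketch of the scoring scheme described above.
# All names here are hypothetical, not Forrester's.

VALID_SCORES = {-2, -1, +1, +2}  # severe failure, fail, pass, best practice
NUM_CRITERIA = 25

def score_review(criterion_scores: list[int]) -> tuple[int, bool]:
    """Return (total, passed) for one review.

    A site passes only if every criterion scores +1 or +2, so the
    lowest possible passing total is 25 * (+1) = +25.
    """
    if len(criterion_scores) != NUM_CRITERIA:
        raise ValueError(f"expected {NUM_CRITERIA} scores, got {len(criterion_scores)}")
    if any(s not in VALID_SCORES for s in criterion_scores):
        raise ValueError("each score must be -2, -1, +1, or +2")
    total = sum(criterion_scores)                   # possible range: -50 to +50
    passed = all(s >= 1 for s in criterion_scores)  # passing implies total >= +25
    return total, passed

# Example: a site that passes 24 criteria but severely fails one.
# Its total (+22) falls below the +25 passing threshold.
total, passed = score_review([1] * 24 + [-2])
print(total, passed)  # 22 False
```

Note that a high total alone doesn't guarantee a pass in this sketch: a site could score +2 on most criteria yet severely fail one, which is why the check looks at every criterion rather than the sum.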

So what did we find?

  • Many poor experiences. Since each of the 25 criteria is scored from -2 to +2, total scores can range from -50 to +50, and a site that passes every criterion (scoring at least +1 on each) earns a total of +25 or higher. But the average score across all of our reviews was only +1.1, and only 3% of the sites earned a passing score (that's 45 sites out of the 1,500. Yes, you read that right: 45).
  • Fluctuations in scores over time. The average score rises and falls across versions of the methodology. But in the latest version, the average finally rose significantly above the years just prior, a trend we hope to see continue. There's a similar pattern when we compare B2C and B2B sites: B2B sites have consistently lagged behind B2C sites in user experience scores, but we're finally seeing that gap narrow.