Voice Of The Customer Awards 2012 — Deadline Extended To Friday, April 6th!

Good news for those of you requesting extensions: We heard you, and we're extending the deadline for Forrester's Voice Of The Customer Award submissions to Friday, April 6th at 5:00 p.m. ET.

While I have you, here are answers to some of the questions I've been getting about the awards:

  • I'm a vendor. Can I still apply? Yes — but only if your submission is about your own VoC program. We don't accept submissions from vendors on behalf of their clients.
  • Does my company have to be headquartered in North America? No! This year we've gone global! We'll accept any submission, as long as it's written in English.
  • Will you honor confidentiality? Yes! The only things we'll publish no matter what are the names of the 10 finalists and three winners. Any specifics beyond that, we'll fact-check with you before publishing.
  • Do I have to be a Forrester client? No! We'd love to hear from you whether you're a client or not.
  • Does the cover page count toward the page limit? No, we're only asking you to limit the content of the submission to seven pages.
  • Can I get an extension? You already did! And no, we won't be offering any extensions beyond Friday, April 6th.

Get more details on the Forrester VoC Awards on our site.

Good luck!

Lessons Learned From 1,500 Website User Experience Reviews

After more than 12 years of evaluating website user experience, Forrester reached a major milestone — completing 1,500 Website User Experience Reviews. That's more than 100 reviews per year or more than 10 per month. Whew! We've been busy.

These reviews (using an expert/scenario/heuristic review methodology) span B2C and B2B sites, intranets, and employee portals across many industries and countries. What we do: We identify target users, attempt to accomplish realistic goals on their behalf, and then grade the experience on a set of 25 criteria, each scored -2 (severe failure), -1 (fail), +1 (pass), or +2 (best practice).

So what did we find?

  • Many poor experiences. Since each of the 25 criteria is scored from -2 to +2, total scores could range from -50 to +50, and passing every criterion (scoring +1 or better on each) would yield a total of +25 or higher. But the average score across all of our reviews was only +1.1, and only 3% of the sites earned a passing score. That's a total of 45 sites out of the 1,500 — yes, you read that right: 45.
  • Fluctuations in scores over time. The average score rises and falls when we look across versions of the methodology and over time. In the latest version, though, the average score finally rose significantly over the years just prior — a trend we hope to see continue. We see a similar pattern when we compare B2C and B2B sites: B2B sites have consistently lagged behind B2C sites in user experience scores, but that gap is finally narrowing.
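The scoring arithmetic above can be sketched in a few lines of code. This is a minimal illustration of the math described in the post — the function names and the example scores are hypothetical, not taken from an actual Forrester review:

```python
ALLOWED_SCORES = {-2, -1, 1, 2}  # severe failure, fail, pass, best practice
NUM_CRITERIA = 25
PASSING_TOTAL = NUM_CRITERIA * 1  # +25: every criterion scored at least +1

def total_score(scores):
    """Sum per-criterion scores after validating the input."""
    if len(scores) != NUM_CRITERIA:
        raise ValueError(f"expected {NUM_CRITERIA} scores, got {len(scores)}")
    if any(s not in ALLOWED_SCORES for s in scores):
        raise ValueError("each score must be -2, -1, +1, or +2 (no zero)")
    return sum(scores)

def is_passing(scores):
    """A site passes only if its total meets the +25 bar."""
    return total_score(scores) >= PASSING_TOTAL

# Hypothetical site: passes 20 criteria, fails 5 outright -> total of +15,
# well above the +1.1 average reported above but still short of the +25 bar.
example = [1] * 20 + [-1] * 5
print(total_score(example), is_passing(example))
```

Note how unforgiving the bar is: even a site with no severe failures falls short if just a handful of criteria fail, which helps explain why only 45 of 1,500 sites passed.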