Posted by Joseph Stanhope on August 13, 2010
Online testing has consistently been top of mind with Forrester clients since my earliest days with the company at the beginning of the year. Clients are a major driver in composing my research agenda, and online testing shot to the top of the list. Clearly, the market had many unanswered questions about online testing and it was time to do a deep dive.
To anchor a new stream of research covering online testing, we’ve just published The Forrester Wave: Online Testing, Q3 2010. If you’re new to Forrester’s research, the Wave methodology is Forrester’s time-tested, exhaustive, and transparent approach to vendor evaluations. This research is based on data gathered through extensive vendor briefings, product demonstrations, customer reference calls, and online user surveys. We evaluated eight leading vendors against 82 criteria and interviewed nearly 90 user companies.
This is Forrester’s inaugural evaluation of online testing vendors. This Wave focused on established vendors that offer products supporting both A/B and multivariate testing techniques. We evaluated the following companies: Adobe, Amadesa, Autonomy, Google, Maxymiser, SiteSpect, Vertster, and Webtrends.
Forrester clients can read the full report to see how the vendors ranked, including underlying scorecard details and the ability to customize the Wave model with personalized weightings.
We found a diverse market of vendors that are differentiated by several key markers that serve as crucial considerations for online testing programs:
- Services. Vendors offer a spectrum of services to support clients, ranging from fully managed services to strictly do-it-yourself approaches.
- Application usability. Attention to application user interface design is undergoing a renaissance as vendors increasingly design their tools to be used by marketers, but some applications are still very technically driven.
- Integration support. Users who intend to leverage online testing applications alongside other marketing and analytics tools need to pay attention to the integration options each vendor offers and ensure that the desired level of connectivity is supported.
- Testing algorithms. Users have varying requirements for statistical rigor and availability of specific algorithms.
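To make the “statistical rigor” point concrete: at the heart of even the simplest A/B testing tool is a significance calculation on conversion rates. The sketch below shows a basic two-proportion z-test, which is one common approach; vendors layer far more sophisticated algorithms on top of this, and the traffic numbers here are entirely hypothetical.

```python
# Minimal sketch of the statistics behind a simple A/B test:
# a two-proportion z-test on conversion rates. This is an
# illustration only, not any specific vendor's algorithm.
from math import sqrt, erf

def ab_test_z_score(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for the difference in conversion rates."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: 200/5000 conversions on control (A),
# 250/5000 on the variant (B).
z, p = ab_test_z_score(conv_a=200, n_a=5000, conv_b=250, n_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these made-up numbers the variant's lift is statistically significant at the conventional 5% level; in practice, tool choice matters precisely because vendors differ in how they handle sequential peeking, multiple variants, and sample-size planning.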
Stay tuned! This is a vibrant and rapidly evolving market. I am continuing to cover the entire site optimization market and am interested in all facets of this space. There are many specialist and emerging vendors in addition to those covered in the Wave, each with an interesting perspective and approach. My future research in this space will delve deeper into the processes and organizational requirements that drive successful online testing.
If you’re a vendor in this market, please keep in touch; I’d love to hear about what you’re doing and where you’re going. And if you’re a user, I definitely want to hear about your experiences to date and plans for the future.
Finally, producing research of this magnitude is a collaborative effort. I’d like to thank the vendors and users who participated in the process for their time, effort, and commitment. Additionally, I’d like to extend a special “Thank You!” to my Forrester colleagues Dave Frankland, Emily Murphy, and Michael Grant who supported the project every step of the way.