Posted by Joe Stanhope on February 8, 2013
I’m pleased to announce that we’ve published "The Forrester Wave™: Online Testing Platforms, Q1 2013." The Wave methodology is Forrester’s time-tested, exhaustive, and transparent approach to vendor evaluations. We base this research on data gathered through extensive vendor briefings, product demonstrations, customer reference calls, and online user surveys. We evaluated seven leading vendors against 53 criteria and gathered feedback from 132 user companies.
This Wave focused on established vendors that offer full-featured online testing solutions targeted at enterprise clients. Based on these criteria, we evaluated the following companies: Adobe, Autonomy, Maxymiser, Monetate, Optimizely, SiteSpect, and Webtrends. Forrester clients can read the full report and access the underlying scorecard details for each vendor. And don’t forget that the Forrester Wave scorecard also includes an interactive tool allowing users to customize the Wave model with personalized criteria weightings.
We invest significant resources covering online testing at Forrester for two reasons. First, online testing is a sound technique for optimizing conversions and customer experiences. Second, and most importantly, optimization capabilities are a core component of Digital Intelligence, Forrester's comprehensive framework for digital analytics. Robust experiments cut through noise and assumptions to help customer intelligence professionals evaluate which content, promotions, and experiences drive success metrics. In short, optimization is a key driver of actionable analytics.
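The "robust experiments" mentioned above typically boil down to controlled A/B tests. As a minimal sketch of the statistics involved, a two-proportion z-test can indicate whether a variant's conversion lift is real or just noise. The visitor and conversion counts below are hypothetical, chosen only for illustration.

```python
# Minimal sketch of the statistics behind an A/B test: a two-proportion
# z-test comparing the conversion rates of a control and a variant.
from math import sqrt, erf

def ab_test_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference).
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF (erf-based).
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical experiment:
# Control: 200 conversions out of 5,000 visitors (4.0%).
# Variant: 260 conversions out of 5,000 visitors (5.2%).
p = ab_test_p_value(200, 5000, 260, 5000)
print(f"p-value: {p:.4f}")  # below 0.05, so the lift is unlikely to be noise
```

A p-value under the usual 0.05 threshold suggests the variant's higher conversion rate is not a chance fluctuation; commercial testing platforms wrap this kind of calculation (or a Bayesian equivalent) in their reporting.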
The online testing platform vendor category has evolved significantly since our last evaluation two years ago. What did we learn?
- The online testing market has room for new vendors. The market for online testing technology is highly active; at Forrester, we track around 20 vendors that serve enterprise-class clients. Competition is a positive force that drives better products and services for users. It's encouraging to see new entrants in the evaluation this year.
- Online testing is about more than the website. Online testing programs must follow digital marketing's ongoing shift beyond fixed internet websites, supporting channels such as social media, mobile websites, and applications to optimize the full customer experience.
- Innovation is alive and well. Online testing platform vendors are responding to expanding user requirements with support for multichannel testing, visitor profiling, advanced experiment scenarios, and automation features.
The future is bright for online testing, but some struggles persist. One stubborn, unresolved challenge is the rationalization of diverse user needs. Users tell us that they want self-service capabilities, but at the same time, organizations rely heavily on third-party services to support their online testing programs. It's hard to be good at everything, and online testing vendors are in the difficult position of serving two masters:
- Democratized testing. Delivering efficient online testing platforms that permit business users to create, deploy, and analyze experiments. To their credit, several vendors are making legitimate progress with application user interfaces that make this possible. The inherent risk to this approach is putting high-powered online testing capabilities without appropriate guardrails into the hands of users who may not fully understand the underlying principles.
- Advanced testing. Many sophisticated users now demand advanced testing capabilities for their highly skilled optimization analysts. Vendors are addressing this user segment with new experiment functionality, data collection and access options, services support, and custom reporting capabilities. The risk to this approach is that experiments become unwieldy and resource intensive, limiting organizations' ability to run enough profitable tests to reach scale.
We'll continue to monitor these developments to determine whether online testing vendors can successfully manage multiple priorities. Another strong possibility is the emergence of specialty vendors catering to various use cases.
But that’s not all! We are currently writing new research to evaluate reference client feedback on online testing vendors. And in Q1, Forrester will launch the Digital Intelligence playbook.
Finally, the Wave process is a collaborative effort. I’d like to thank the vendors and users who participated in the evaluation for their time, effort, and commitment. Additionally, I’d like to extend a special “Thank You!” to my Forrester colleagues Dave Frankland and Allison Smith, who supported the project every step of the way.