Last week, I was in LA, hosting a session on online panel quality at Forrester’s Marketing Forum. I discussed the past, present, and future of online panel quality with Steve Schwartz from Microsoft, Maria Cristina Gomez from Procter & Gamble, and Frank Findley from ARS Group.
Online panel quality is still a major issue in the industry. The whole discussion started in 2006 with a speech by Kim Dedeker -- at that time, the VP of global consumer and market knowledge at Procter & Gamble. In it, she publicly expressed her concerns about online panel quality, how it affected the company's research results, and, as a result, the credibility of market research. In her speech, she stressed that, in her opinion, the industry -- both research suppliers and clients -- needed to focus on improving the overall quality of research. Her appeal to the industry resonated. Many other research buyers weighed in with their own stories, and research providers took up the challenge. Since then, many initiatives have launched, such as the ARF's Foundation of Quality and ESOMAR's 26 Questions, as well as more technology-driven approaches like Peanut Labs' Optimus and MarketTools' TrueSample.
A few years ago, Procter & Gamble publicly stated that it had experienced inconsistent research results from successive online research projects. Other organizations shared similar experiences, and questions were raised about “professional respondents.” The trustworthiness of online research was in question, and multiple initiatives arose. In the past two years, we’ve seen a lot of debate around this topic, and associations such as ESOMAR and ARF have come up with protocols that all good panels should follow — and many have. But what does this mean from a client perspective? How have initiatives like ARF's Quality Enhancement Process, MarketTools' TrueSample, or processes like machine fingerprinting changed the industry?
Last week, I attended Research 2010, the research conference organized by the UK's Research Organization. One session covered innovative research methodologies, and although the topic isn't completely new to the industry, I was surprised to see two of the presentations focus on methodologies that capture people's unconscious behavior through technology.
The first was a presentation about lifelogging, or “glogging” for those in the know. Simply put, lifelogging documents somebody's life through technology worn by the “respondent.”
Bob Cook from Firefish presented how this technology helps researchers better understand the tradeoffs that people constantly make. Lifelogging has a long history, and it was started by Steve Mann. In the early 1980s, he walked around with recording gear that looked more like a suit of armor.
One of the key themes I saw popping up in 2009 was the need for market researchers to communicate insights instead of information (or, even worse, data). I've been at a number of events where this was discussed, and I followed multiple discussions in market research groups such as Next Generation Market Research (NGMR) on LinkedIn. I added to this discussion myself by publishing a report called The Marketing Of Market Research: Successful Communication Builds Influence.
The general consensus is that market researchers should stay away from elaborating on the research methodology and presenting research results with many data-heavy slides and graphics. Instead, they should act more like consultants: produce a presentation that reads like an executive summary (a maximum of 20 slides or so) and starts with the recommendations. The presentation should show the key insights gained from the project, cover how the results tie back to business objectives, and include alternative scenarios and advice on possible next steps.
However, another takeaway from these conversations is that not all market researchers are equally well equipped to deliver such a presentation, where they're asked to translate data into insights, come up with action items, and tell a story. Most participants in the discussions agreed with the statement that the majority of market researchers still feel most comfortable presenting research outcomes (aka numbers).
By now, most of you know my love for infographics. A colleague recently pointed me to a great tool from the World Bank: The World Bank Data Visualizer.
It has it all: data for 209 different countries, trending, and customizable axes. It's a great tool for anyone doing global research who wants to know more about the countries being researched and how they relate to each other.
Recently, I was asked by Research Magazine to contribute to an article about market research in 2010. The caveat: I was allowed only ONE word to describe what I saw as the most important change, trend, or force affecting market research in 2010.
In hindsight, 2009 marked a turning point for the market research industry, when technology and innovation became part of the ongoing discussion on how to move the industry forward while balancing the realities of a business world in a recession.
A couple of weeks ago, I published a post called 'The Future Of Research: Building A 3-Dimensional View Of The Customer'. The gist of my post was that consumers connect with companies through different channels and leave feedback about the company in different places. They expect companies to understand that, and they don't want to be asked about things they've already shared.
In the past year, I've spent quite some time looking into innovative research methodologies. One methodology that has really won me over is mobile research (see my report The Challenges And Opportunities Of Mobile Research for full details). The anytime, anywhere aspect of the mobile phone, combined with people's emotional attachment to it, makes it an ideal device for sharing thoughts and opinions in a research context.