The Online Panel Quality Debate

A few years ago, Procter & Gamble publicly stated that it had experienced inconsistent research results from successive online research projects. Other organizations shared similar experiences, and questions were raised about “professional respondents.” The trustworthiness of online research was in question, and multiple initiatives arose. In the past two years, we’ve seen a lot of debate around this topic, and associations such as ESOMAR and ARF have come up with protocols that all good panels should follow — and many have. But what does this mean from a client perspective? How have initiatives like ARF's Quality Enhancement Process, MarketTools' TrueSample, or processes like machine fingerprinting changed the industry?

Next month, I'm hosting a panel at Forrester's Marketing Forum 2010 with participants from Microsoft, Procter & Gamble, and the ARS Group to understand what the challenges with online sampling are today — and how they affect adaptability.

Questions I will discuss with the panel include the following:


Capturing The Unconscious

Last week, I attended Research 2010, the research conference organized by the UK's Research Organization. One session covered innovative research methodologies, and although the approach isn't completely new to the industry, I was surprised that two of the presentations focused on methodologies that capture people's unconscious behavior through technology.

The first was a presentation about lifelogging, or “glogging” for those in the know. Simply put, lifelogging documents somebody's life through technology worn by the “respondent.”

Bob Cook from Firefish presented how this technology helps researchers better understand the tradeoffs that people constantly make. Lifelogging has a long history, pioneered by Steve Mann, who in the early 1980s walked around with recording gear that looked more like a suit of armor.



Please yell at me for doing my job badly

Last Saturday, at the Silicon Valley Product Camp, I was part of a panel on PM metrics. Any topic that is both important and unsettled keeps you thinking long after the panel ends, so, not surprisingly, almost a week later I'm still chewing on it. Here's an observation I'll make today, after further pondering:

You know you're doing well as a PM when someone yells at you for getting a persona, user story, use case, or task analysis wrong.

Understanding the world from the standpoint of the individual buyer or user is one of the primary responsibilities of a PM. According to some schools of thought, it's the core responsibility, especially since no one else in a technology company is responsible for collecting, analyzing, and distributing these deep customer insights. (There are other core responsibilities, too, related to the company's business and the technology itself.)

That information may look academic, but it should be immediately pertinent in very important ways. Understanding the way in which people in a variety of roles assess, purchase, and adopt technology is critical for making smart decisions about everything from product design to the product roadmap, from crafting messaging to choosing marketing channels. Unless you live in a Soviet-style command economy, in which manufacturing 3,000 left shoes is a problem for the consumer, not the producer, customer insights need to inform both strategic and tactical decisions.
