Please yell at me for doing my job badly

Last Saturday, at the Silicon Valley Product Camp, I was part of a panel on PM metrics. Any topic that's both important and unsettled keeps you thinking long after the panel, so not surprisingly, almost a week later, I'm still chewing on it. Here's an observation I'll make today, after further pondering:

You know you're doing well as a PM when someone yells at you for getting a persona, user story, use case, or task analysis wrong.

Understanding the world from the standpoint of the individual buyer or user is one of the primary responsibilities of PM. According to some schools of thought, it's the core responsibility, especially since no one else in a technology company is responsible for collecting, analyzing, and distributing these deep customer insights. (There are other core responsibilities, too, related to the company's business and the technology itself.)

That information may look academic, but it should be immediately pertinent in very important ways. Understanding the way in which people in a variety of roles assess, purchase, and adopt technology is critical for making smart decisions about everything from product design to the product roadmap, from crafting messaging to choosing marketing channels. Unless you live in a Soviet-style command economy, in which manufacturing 3,000 left shoes is a problem for the consumer, not the producer, customer insights need to inform both strategic and tactical decisions.

Yet both product managers and product marketers complain that they don't get enough time to do this research. When they can focus on these insights, the rest of the organization often doesn't know what to do with them. Some industry trends (for example, Agile, social media, and SaaS) have made attention to the customer a larger priority, but we're not yet living in a world where, unambiguously, for the vast majority of PMs, deep customer insight is the foundation of their jobs. Managing the enhancement list is not the same as understanding the reasons behind those requests.

Here's my simple metric for knowing when deep customer insights are an important part of a PM's job:

  • Your company wants you to dedicate time regularly to this work.
  • The results, in whatever form they take (personas, use cases, etc.), become a criterion for product, service, and portfolio decisions. I'm thinking about both inbound activities, such as requirements, and outbound ones, like designing demos.
  • The company knows when your insights contain errors, traced from your analysis to some business outcome (poor adoption of a feature, poor results from a marketing campaign, etc.).
  • Your company knows when, realistically, errors are your fault, and when they're an unavoidable part of doing research.

That last point is critical. Mistakes will happen, in part because deep customer insights take time to develop. Meanwhile, work continues, often unavoidably based on flawed or incomplete analysis. As long as your company understands this learning process, and supports your ongoing work, you're in a good position. When co-workers understand the value of these insights, and are impatient when they're not perfect, you're in a great position.


Examples of the yelling we'd want?

Spot on... these would be great metrics. I'm struggling, though, to identify any companies where #3 or #4 happen. Extending your thought:

[1] We seem to have PM delivery metrics (i.e. "did Engineering get requirements from product mgmt on time") rather than content quality metrics. Expectations are extremely low when success is bringing *any* requirements or personas or market insights to the table. Dev teams are willing to do their own 'market of one' interviews and product showcases precisely because they don't often see real insight within requirements, don't perceive this to be very difficult, and are used to going without customer input.

[2] Evaluations of PM customer insight tend to be infrequent, informal, and limited to where strong product mgmt leadership exists (seasoned Dir/VP of PM managing multiple PMs). It's hard to grade your own performance without peers who study similar customers/segments, so PMs working solo don't have infrastructure or incentive to be self-critical.

[3] For new products, the lag between ideation/customer input/design/development and actual revenue is very long, and we tend to use product delivery (i.e. shipments and revenue) as a late proxy for good customer insight. "We won't know if we're right until beta." This is barn-door-ex-post-horse thinking. Or classic after-the-fact course correction.

Managing the enhancement list not the same as understanding why...

Nice post, Tom. Understanding the customer's world is indeed a core responsibility of PM teams. Often (I think mainly due to lack of time) PMs end up spending a lot of time on the "enhancement list... not... the reasons behind those requests."

I expanded on this in my post: