Posted by Tom Grant on September 24, 2010
Now that we've posted the outline for our study of thought leadership in the technology industry, it's a good time to take stock of our success so far. It's important to start the retrospection process, even if we're not completely done with the project yet, because we're using this study to pilot a different way of doing research. As we said in the document explaining this approach, Agile Research Development, we set the following goals (shown in order of priority):
- Ask the right questions.
- Enhance the quality of the answers.
- Maximize the voice of the customer.
- Make adjustments quickly.
- Be even more relevant to our clients.
I'm confident, based on the work so far, that we already hit our #1 goal, asking the right questions. Why that particular objective sits at the top of the list, and what challenges we face in achieving it, are worth a little discussion.
Contrary To Popular Opinion, Analysts Are Not Paid Blabbermouths
As part of our role-based strategy, we keep a careful eye on the "success imperatives" for our roles, such as technology product manager/marketer. But what's a success imperative for an industry analyst? "Asking the right questions" is certainly on the list. Good research starts with a timely, important question that leads to useful answers.
For someone like me, with an innate curiosity about practically everything, it's not always easy to get to the right question, at least not right away. There are plenty of interesting topics -- interesting to me, at least -- but they might not be immediately relevant to our readership. Big ideas are still worth writing down, but it's important to realize that they may not affect anyone's to-do list for quite a long time.
The difference between what's interesting and what's immediately relevant is a well-understood part of the history of science. The old PBS series Connections, for example, explained how a single invention might be a catalyst for other inventions. Often, these connections move in unexpected directions, and the gaps between inventions can be measured in centuries. In one episode, the host of Connections, James Burke, argued that the chain of inventions that resulted in the computer started with ancient Sumerian astrology. Each link in this chain of invention, from horoscopes to software, spanned decades or centuries.
Therefore, while a particular subject might lead in a lot of interesting and potentially important directions, it's not always obvious which questions about that topic will have immediate relevance. Thought leadership, for example, is always going to suggest all kinds of interesting questions. Is thought leadership purely based in perception, or are there more tangible indicators? Are there levels of thought leadership, or is it strictly a yes-or-no question, such as Is Microsoft still in business? or Did you take out the garbage? (I can assure you, from experience, that Sort of is not an acceptable answer to the latter question.)
We can fill up our research agendas with all kinds of topics. Sadly, we are mere mortals, capable of writing only so many documents each quarter. Therefore, we stick with the questions that help our clients right away instead of the ones that seem interesting to us. If I fail to apply that discipline to a piece of research, my boss will have a few blistering words to say about it.
Thought Leadership Leads In Too Many Directions
Still, we can't always guess with 100% accuracy which among a broad menu of questions is the one that satisfies our clients' taste for relevance. Thought leadership offers a particularly vast smörgåsbord of potential topics, so perhaps the most important feedback we've received during this project happened at the very beginning, when it helped us identify which aspects of thought leadership we needed to cover.
Every day that I work on this project, I silently thank the community members who told us to focus on finding a working definition of thought leadership, plus some way of measuring it. If I had not received that feedback, I would have written a much different document. While I definitely would have tried to answer the question, What is thought leadership? I would have done it in passing, as I moved swiftly on to the main topic I had intended, the path to thought leadership.
I'm sure we'll address that question in a future piece of research. (In fact, during this project, we've kept a careful eye on how this new approach might help us better identify potential follow-up studies.) It would have done less good to start with that question if the more fundamental question, in the rapidly evolving technology industry, is, Am I a thought leader or not?
As a result of this experience, I'll be extra careful to include client feedback on the selection of the question, not just the selection of the topic. To the extent that this experiment has had some immediate effect on my own behavior, I'd say it's a success.