Stop Watching The Stock Ticker And Start Improving Customer Experience

As an avid personal investor I’m often appalled by cable shows that report on the markets as if they were non-stop sporting events. Seriously, how many people care how the NASDAQ or the Dow are doing in any given minute of any given day? But apparently there are enough day traders out there that noon reports from the floor of the New York Stock Exchange are as compelling as half-time reports during the NFL playoffs.

Nah.

I have to confess that there is one piece of financial analysis that I do look forward to – though in my defense, this is an annual occurrence and not an hourly update. The analysis comes from Jon Picoult, a gentleman who runs Watermark Consulting.

For a while now Jon has been taking the data from Forrester’s Customer Experience Index (CXi) and using it to do a thought experiment. In this experiment he looks at what would have happened if, back when we first published the CXi, an investor had taken two equal buckets of money and created two U.S. stock portfolios. The first portfolio would have consisted of the top 10 publicly traded companies in our index (the customer experience leaders). The second portfolio would have consisted of the bottom 10 publicly traded companies in the index (the customer experience laggards).

In Jon’s model the investor would have held each portfolio for a year, then sold them both and taken his profits (or losses). He would have then used the proceeds to purchase the new year’s leaders and the new year’s laggards, continuing this cycle of selling and buying for all six years that the CXi has been in existence.
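The annual sell-and-reinvest cycle Jon describes is just yearly compounding of each portfolio's proceeds. Here's a minimal Python sketch of that mechanic; the annual returns below are made up for illustration and are not the actual CXi portfolio results.

```python
# Sketch of the leaders-vs-laggards thought experiment: each year the
# investor sells the whole portfolio and reinvests the proceeds in the
# new year's list, so returns compound on the running total.

def rebalance(initial, annual_returns):
    """Compound a portfolio that is sold and fully reinvested each year."""
    value = initial
    for r in annual_returns:
        value *= 1 + r  # sell, take profits or losses, buy the new list
    return value

# Six hypothetical years of returns per portfolio (illustrative only).
leaders_returns = [0.12, 0.08, -0.03, 0.15, 0.10, 0.07]
laggards_returns = [0.04, -0.02, -0.10, 0.06, 0.01, 0.03]

print(f"Leaders:  ${rebalance(10_000, leaders_returns):,.2f}")
print(f"Laggards: ${rebalance(10_000, laggards_returns):,.2f}")
```

The point of the structure is that a modest annual edge for the leaders compounds over the six years the CXi has existed, widening the gap between the two portfolios.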

Intriguing, right? Even those of us who believe in the business value of customer experience (or, in my case, can prove it through research) don’t normally look at its impact on stock performance.

Read more

You Asked, Forrester Answered: Questions About Customer Experience Design

Of the six disciplines in Forrester’s customer experience maturity model, design is probably the least understood. It’s not taught in most business schools (although this is starting to change at institutions like Stanford and the University of Toronto). It’s also not widely practiced in most companies outside of specialized groups that focus on digital touchpoints. And so it remains a mystery to most business people. That’s a shame, because design is an incredibly valuable business tool — and it’s accessible to just about anyone in any organization.

That’s why I wanted to take time this week to answer some of the questions that I’m frequently asked about customer experience design. In fact, all of the following are exact questions that I’ve received from Forrester clients over the past year.

What exactly is this design thing again?

Design is both a process and a mindset

Let’s talk about the process part first. Designers typically follow a common set of steps when trying to solve a problem: research that helps them uncover deep emotional insights about people’s wants and needs, analysis that helps them identify the real problems and issues, ideation of dozens (or hundreds) of possible solutions, prototyping that helps them bring those ideas to life in tangible ways, and testing that helps them evaluate the proposed prototypes and solutions. Designers don’t go through this process once — they iterate it several times, learning from their prototypes and refining their solutions.

Read more

Avoid The "All Listen And No Action" VoC Program Trap

Voice of the customer (VoC) data is alluring. Once you start to collect customer feedback, there's always something more you could be gathering. You think: What else can I learn? What else are customers saying and thinking? Where else are they saying it? You want to know more.

But collecting the data — listening — isn't enough.

At Forrester, we describe the continuous cycle of activities that make up VoC programs as: listen, interpret, react, and monitor. "Listen" is all the customer feedback you're collecting via listening posts like surveys, emails, calls, and comment cards. "Interpret" is the analysis you do on that feedback (and other related data) to understand what it all means. "React" is what you do to fix the experience based on that analysis. "Monitor" is how you make sure that whatever you did to react is actually working.

It's critical to go through the full cycle with whatever data you're already collecting. Because here's the hard truth: You get no ROI from listening or interpreting. None. Zero. Zip. You only get business results from actually improving the experience.

Read more

Experience Design Will Rule in the Post-PC Era

The last few days have been quite rough on PC-era titans Microsoft and HP. While my colleague Ted Schadler is correct in saying we're in a multi-device, "right tool for the job" era, the unfortunate truth for PC makers is that, for many consumers, the right tool for the job just so happens to be the mobile devices they carry with them, not the PC sitting in their bedroom or home office or wherever people keep them these days. In fact, 77% of mobile searches take place in the home or at work, where a PC is readily available. Whether you call it lazy or convenient, the simple fact is that smartphones and tablets are quickly becoming the go-to computing devices for consumers.

This shift in ownership and use behavior marks the dawn of a new age in customer experience. As I discuss in my new report, Customer Experience in the Post-PC Era, as customers shift their attention to mobile devices, their expectations are fundamentally changing. In the post-PC era, customers expect companies to provide experiences aligned with their needs and abilities, in the right context, and at their moment of need. To deliver on this, customer experiences need to become:

Read more