You know what the Holy Grail is for an analyst? It’s results data – especially financial results data. And that’s especially true for analysts who cover customer experience because all too often CX professionals don’t track – or won’t share – their results.
That’s why I’m especially pleased with what I am able to share with you today.
Last week I posted part 1 of Forrester’s customer experience Q&A with Olivier Mourrieas of E.On, one of the world's largest investor-owned electric utility service providers. Olivier will be speaking at Forrester’s Forum for Customer Experience Professionals EMEA in London on November 17 and 18, 2014, and he was kind enough to share some thoughts with us in advance of his appearance.
This week I’m posting part 2 of Olivier’s answers, in which he tells us the tangible business results that the E.On CX program has achieved.
I hope you enjoy what he has to say and I look forward to seeing some of you in London!
Q: How do you measure the success of your customer experience improvement efforts (e.g., higher customer satisfaction, increased revenue, lower costs)? And have you seen progress over time?
There are hard and soft benefits that we are continuously demonstrating:
Churn reduction: Increasing Net Promoter Score (NPS) leads to increased loyalty. This will help to stabilise the Private Household and SME customer base.
At its annual Energy Analyst And Sourcing Advisor Event in Berlin, Deutsche Telekom/T-Systems re-emphasized its commitment to serving the energy sector with a dedicated offering. Over the last three years, Deutsche Telekom has invested significant resources in building the expertise to become a platform and service provider for the utility sector. Our main observations during the event were that Deutsche Telekom:
Smart meters provide consumers with granular data on how they are consuming energy — when is the meter spinning fastest, which appliances are the energy guzzlers, how much energy are those idling appliances consuming? Programs to increase consumer awareness and shift demand to off-peak times abound. I delay the start of my dishwasher to after 11pm here in France to take advantage of off-peak tariffs. Most consumers, however, are not highly motivated by just knowing their own consumption. Good news: Opower, a provider of really smart energy solutions, has cracked the code.
The Opower solution draws on a study of how messages influence consumption. It turns out that if you tell people they will save money by turning off their air conditioning and turning on a fan during peak hours, they likely won’t: those are typically the times when it is really hot. Messages about “civic responsibility” and “saving the environment” don’t really register either. However, when consumers are told that 75% of their neighbors will turn off their air conditioning and turn on a fan, behavior changes: that message produced a 6% drop in consumption. Opower now uses these kinds of comparisons in all of its offerings.
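To make the “granular data” point concrete, here is a minimal sketch, in Python, of the kind of analysis a smart meter enables. The readings, the off-peak window, and the tariff rates below are all invented for illustration; they are not tied to any real utility’s pricing.

```python
# Hypothetical hourly meter readings (hour of day -> kWh consumed).
readings = {0: 0.3, 7: 1.2, 12: 0.9, 19: 2.4, 23: 1.1}

# "When is the meter spinning fastest?" -- the hour with peak consumption.
peak_hour = max(readings, key=readings.get)

# A simple two-tier tariff: assumed off-peak window of 23:00-07:00.
OFF_PEAK_HOURS = set(range(23, 24)) | set(range(0, 7))
peak_rate, off_peak_rate = 0.20, 0.12  # EUR/kWh, illustrative rates only

# Total cost under the tariff -- the basis for shifting load off-peak.
cost = sum(kwh * (off_peak_rate if h in OFF_PEAK_HOURS else peak_rate)
           for h, kwh in readings.items())
```

Running the dishwasher after 11pm, as in the example above, is exactly the kind of load shift this sort of calculation makes visible: the same kilowatt-hours cost less once they move into the off-peak window.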
This week, the New York Times ran a series of articles about data center power use (and abuse): “Power, Pollution and the Internet” (http://nyti.ms/Ojd9BV) and “Data Barns in a Farm Town, Gobbling Power and Flexing Muscle” (http://nyti.ms/RQDb0a). Among the claims made in the articles was that data centers were “only using 6 to 12 percent of the energy powering their servers to deliver useful computation.” Like a lot of media broadsides, the reality is more complex than the dramatic claims made in these articles. Technically, the articles are correct that only a small fraction of the electricity going to a server is used to perform useful work, but this dramatic claim is not a fair representation of the overall efficiency picture. The Times analysis fails to consider that not all of the power in a data center goes to servers, so the claim of 6% server efficiency is not representative of the real operational efficiency of the complete data center.
On the other hand, while I think the Times chooses drama over even-keeled reporting, the actual picture for even a well-run data center is not as good as its proponents would claim. Consider:
A new data center with a PUE of 1.2 (very efficient) sends roughly 83% of its power to IT workloads.
Then assume that 60% of that IT power goes to servers (storage and network get the rest), for a net of almost 50% of total power going into servers. If the servers run at an average utilization of 10%, then only 10% of 50%, or 5%, of the power actually goes to real IT processing. Of course, the real "IT number" is servers plus storage plus network, so depending on how you account for them, IT usage could be as high as 38% (0.83 × 0.4 + 0.05).
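As a back-of-the-envelope check, the arithmetic above can be sketched in a few lines of Python. The 60% server share and 10% utilization are the assumptions stated in the text, not measured figures:

```python
# Power Usage Effectiveness: facility power / IT power.
pue = 1.2
it_fraction = 1 / pue                 # ~0.83 of facility power reaches IT gear

server_share = 0.60                   # servers' assumed share of IT power
server_fraction = it_fraction * server_share  # ~0.50 of total facility power

utilization = 0.10                    # assumed average server utilization
useful_compute = server_fraction * utilization  # ~0.05 -> ~5% "useful" power

# Counting storage + network as useful IT load as well gives the upper bound:
it_usage_upper_bound = it_fraction * (1 - server_share) + useful_compute  # ~0.38
```

The pessimistic 5% figure and the generous 38% figure differ only in whether storage and network power count as “useful,” which is exactly the accounting choice the Times articles glossed over.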
Last week, I attended ONS (Offshore Northern Seas) 2010, one of the world’s largest energy conferences, with more than 49,000 participants, in Stavanger, Norway. The conference theme was “energy for more people,” an important goal, not only to keep pace with the growth of the world’s population (expected to hit 9-plus billion people by 2050) but also to fight poverty and raise living standards around the globe. However, soon after the opening ceremony by King Harald V, the first panel discussion made it very clear that the path to this goal has many facets and that the world’s leaders, including politicians, academics, business people, and other authorities, are far from reaching consensus on the right path today.
Conventional Energy Resources
Global energy demand will increase by roughly 45% over the next 20 years (according to the International Energy Agency), but what will the mix of energy resources look like by 2030? Most scenarios predict that fossil fuels will remain the primary energy source, with oil and gas making up 65% of total demand. To no one’s surprise, most of the presentations and exhibitions at ONS 2010 were therefore dedicated to the future of fossil fuels, and they can be grouped into the following themes for satisfying the energy demand of tomorrow:
Unlocking new oil and gas reserves around the world. The concept seems straightforward: overcome technical and political hurdles and drill deeper, faster, and more efficiently to carry exploration into new territories such as the Arctic or the ultra-deep sea.
A few days ago at Forrester’s IT Forum in Lisbon (June 9-11), I gave a presentation together with my colleague Andy Bartels on the IT market recovery (we predict 9.3% IT market growth in 2010) after two economically challenging years in 2008 and 2009. In fact, we made the point that the market rebound we currently see is not simply a recovery but the beginning of a new IT hyper-growth phase fueled by a new wave of innovation.
A strong driver of this innovation is what we at Forrester call Smart Computing: the integration of physical-world information into intelligent IT-supported business processes in four steps: Awareness (via new sensor technology), Analysis (with advanced BI solutions), Alternatives (including rules and process engines), and Action (in industry business applications), plus a fifth feedback loop of Auditability for tracking and learning.
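The shape of that five-“A” loop can be illustrated with a toy pipeline. Everything below (the sensor reading, the anomaly threshold, the action names) is invented purely to show the flow from sensing through action and back into an audit trail; it is not a Forrester or vendor implementation:

```python
def awareness():
    """Step 1 -- a sensor (e.g., a smart meter) produces a reading."""
    return {"meter_id": "M1", "kwh": 5.2}

def analysis(reading):
    """Step 2 -- BI logic flags unusual consumption (threshold is made up)."""
    return {"anomaly": reading["kwh"] > 4.0, **reading}

def alternatives(insight):
    """Step 3 -- a rules engine selects a candidate response."""
    return "notify_customer" if insight["anomaly"] else "none"

def action(decision):
    """Step 4 -- a business application executes the chosen response."""
    return f"executed:{decision}"

# Step 5 -- auditability: record the full loop for tracking and learning.
audit_log = []
reading = awareness()
insight = analysis(reading)
decision = alternatives(insight)
result = action(decision)
audit_log.append((reading, decision, result))
```

The point of the sketch is the closed loop: each pass through the four steps leaves a trace in the audit log, which is what enables the tracking and learning the fifth step describes.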
A well-known example of a smart computing solution is smart metering in the utilities industry. In another presentation in Lisbon, a colleague asked the audience, a room full of representatives from the leading IT service companies, who had a smart metering initiative running; everyone in the room raised a hand. Then he asked who actually had more than one to three (pilot) projects running, and almost no one raised a hand.
So is smart metering just hype that everyone is jumping on? And what is the reality behind this lighthouse example of smart computing at this point in time?