That question seems to come up often. I know I’m sitting on valuable data, but I’m not sure just how valuable. When the data is used internally to improve operational efficiency or service delivery, the resulting cost savings demonstrate the value. When it is used to identify new customer opportunities, whether upselling existing customers or identifying potential new ones, the resulting revenue demonstrates the value. But what if I want to take the data to market? What’s the data worth? That question is harder to answer, but not impossible.
The first question I’d ask myself is what I already know. What are the givens in the equation? Think back to a math course: you are trying to solve a problem, so what have you been told? In fact, I’ve been doing math with my son, and that exercise has helped me frame the approach to pricing data. We know the length of one side of the triangle, and we know its relationship to the other sides. While we don’t know the length of every side, we know enough to figure them out.
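To make the schoolbook analogy concrete (my numbers, purely illustrative, and not a pricing formula): in a 30-60-90 right triangle, a single known side plus the fixed relationships among the sides is enough to solve for everything else.

$$c = 10, \qquad a = \frac{c}{2} = 5, \qquad b = \frac{c\sqrt{3}}{2} = 5\sqrt{3} \approx 8.66$$

Pricing data works the same way: start from the givens, such as comparable data products already on the market or the cost savings and revenue the data generates for a buyer, and solve for the unknown, the price.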
In scanning through my O’Reilly Data Newsletter today, I noticed A Healthy Dose of Data, an MIT Sloan case study on the data and analytics culture at Intermountain, a healthcare network that runs 22 hospitals and 185 clinics. The study is definitely worth the read. It reviews the history of data use at Intermountain, which began way before the “big data” craze of recent years. In fact, it was back in the 1950s that one of the Intermountain cardiologists, Homer Warner, began to explore clinical data to understand why some heart patients experienced better outcomes than others. He went on to become known as the “father of medical informatics – the use of computer programs to analyze patient data to determine treatment protocols,” and with colleagues designed and launched their first decision-support tool.
The case study goes on to describe how Intermountain has cultivated a strong data and analytics culture. Over time – Rome was not built in a day, as they say – they established data maturity across the organization by investing in the capacity (new tools and technologies), developing the competencies (new skills and processes), and finally spreading the culture (awareness, understanding, and best practices) of data and analytics. Their analytical approach brought results – fewer surgical infections, more effective use of antibiotics, less time in intensive care, etc. – contributing to lower costs, better medical outcomes, and greater overall patient satisfaction.
An inquiry call from a digital strategy agency advising a client on data commercialization generated a lively discussion of strategies for taking data to market. With few best practices out there, the emerging opportunity just might feel like space exploration – going boldly where no man has gone before. The question is increasingly common: "We think we have data that would be of use to others, but how do we know? And which use cases should we pursue?" In It's Time To Take Your Data To Market, published earlier this fall, my colleagues and I provided some guidance on identifying and commercializing that "Picasso in the attic." But the ideas around how to go to market continue to evolve.
In answer to the inquiry questions asked the other day, my advice was pretty simple: Don’t try to anticipate all possible uses of the data. Get started by making selected data sets available for people to play with, see what they can do, and talk about them to spread the word. However, there are some specific use cases that can kick-start the process.
Look to your existing customers.
The grass is not always greener, and your existing clients might just provide some fertile ground. A couple of thoughts on ways your existing customers could use new data sources:
The data economy — or the system that provides for the exchange of digitized information for the purpose of creating insights and value — grew in 2014, but in 2015 we’ll see it leap forward significantly. It will grow from a phenomenon that mainstream enterprises view at arm’s length as interesting to one that they embrace as part of business as usual. The number of business and technology leaders telling us that external data is important to their business strategy has been growing rapidly – from one-third in 2012 to almost half in 2014.
Why? It’s a supply-driven phenomenon made possible by widespread digitization, mobile technology, the Internet of Things (IoT), and Hadooponomics. With countless new data sources and powerful new tools to wrest insights from their depths, organizations will scramble to know their customers better and to optimize their operations beyond anything they could have done before. And while the exploding data supply will spur demand, it will also spur additional supply. Firms will take a hard look at their “data exhaust” and wonder if there is a market for new products and services based on their unique data. In many cases, though, the value will lie not in bulk downloads or access to raw data, but in data products that complement a firm’s existing offerings.
I had a fascinating inquiry this morning with a government securities commission (not the SEC, and not in the US). The client had a classic question about how to navigate the new data economy. The commission produces and consumes large volumes of data but continues to struggle to answer persistent business questions like how well it is doing, or even who it is doing it for. Yes, securities commissions regulate securities markets; they monitor publicly traded companies, investment houses, and brokerage firms. However, this commission keeps asking, “For whom?” Who are the investors it is protecting with its regulation? As they expressed the question, “How do we know what Mrs. Smith is investing in?” They currently work with several large data providers that supply financial information on companies, but that information wasn’t exactly what they were looking for. Essentially, in this Age of the Customer, they want to know who their “customers” are. This was a question about how best to serve their customers, in this case the investors.
They wanted to know how to source additional third-party data that would give them a clearer picture of the investors they serve. Census data provides a wealth of information about households and individual finances, but the data teams at the commission are not experts in navigating it. Data providers like Thomson Reuters cover the financial services industry; others, such as Experian or Acxiom, provide information on consumers. What other kinds of data providers can help them build a data strategy that answers that basic question: who are their customers, and how can they serve them better?
An explosion of data is revolutionizing business practices. The availability of new data sources and delivery models provides unprecedented insight into customer and partner behavior and a much improved capacity to understand and optimize business processes and operations. Real-time data allows companies to fine-tune inventories and in-store product placement; it allows restaurants to know what a customer will order even before they read the menu or reach the counter. And data is also the foundation for new service offerings from companies like John Deere, BMW, and Starwood.
Last month, The GovLab, a research organization at New York University, released a beta version of its Open Data 500 project. The study set out to profile US companies that use open data to generate new business and develop new products and services. Not all of the companies identified have been profiled yet, but the list of 500 spans a wide range of established companies and start-ups that benefit from the use of open data.
While the start-ups are interesting illustrations of innovation and economic value-creation, the presence of big, existing companies illustrates how data transforms business.
Insurance companies such as Allstate and Allianz no longer only insure people and property.
As an analyst on Forrester's Customer Insights team, I spend a lot of time counseling clients on best-practice customer data usage strategies. And if there's one thing I've learned, it's that there is no such thing as a 360-degree view of the customer.
Here's the cold, hard truth: you can't possibly expect to know your customer, no matter how much data you have, if all of that data 1) is about her transactions with YOU and 2) is hoarded away from your partners. And this isn't just about customer data either – it's about product data, operational data, and even cultural-environmental data. As our customers become more sophisticated and collaborative with each other ("perpetually connected"), organizations must do the same. That means sharing data, creating collaborative insight, and becoming willing participants in open data marketplaces.
Now, why should you care? Isn't it kind of risky to share your hard-won data? And isn't the data you have enough to delight your customers today? Sure, it might be. But I'd put money on the fact that it won't be for long, because digital disruptors are out there shaking up the foundations of insight and analytics, customer experience, and process improvement in big ways. Let me give you a couple of examples:
Banks have a reputation for being stodgy and conservative. But Crédit Agricole (CA) has broken the stereotype. I had a great discussion a few weeks ago with Bernard Larrivière, Director of Innovation, and Emmanuel Methivier, the CA Store Manager, about the CA Store launched last fall. The store houses new services developed by third-party developers using the bank’s secure customer data — one small step for CA, one giant step for the banking industry and the data economy.
The CA Store was not only inspired by the Apple Store model but also by government open data initiatives. The public sector provided the model of exposing APIs to internal data and working with independent developers to encourage application creation. However, in a move that will likely be carefully watched by their public sector brethren, CA recognized the need for a better business model to incent developers to use the data, and to sustain the development and maintenance of the applications.
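To illustrate the pattern in the abstract, here is a minimal sketch of how a third-party developer typically interacts with such a platform. Every endpoint, parameter, and field name below is hypothetical; I have no visibility into the CA Store's actual interfaces, and this is simply the common OAuth-style open-API shape, not CA's implementation.

```python
import requests

# Hypothetical open-banking-style API. All endpoints and fields are
# illustrative assumptions, not the CA Store's actual interfaces.
API_BASE = "https://api.examplebank.com/v1"

# 1. Exchange registered app credentials for a short-lived access token
#    (OAuth2 client-credentials style), so the third-party application
#    never handles raw customer credentials.
token_resp = requests.post(
    f"{API_BASE}/oauth/token",
    data={
        "grant_type": "client_credentials",
        "client_id": "my-registered-app",  # hypothetical app ID
        "client_secret": "app-secret",     # hypothetical secret
        "scope": "accounts:read",
    },
)
access_token = token_resp.json()["access_token"]

# 2. Query only the data the bank has chosen to expose, under the
#    scopes granted to the app.
accounts = requests.get(
    f"{API_BASE}/accounts",
    headers={"Authorization": f"Bearer {access_token}"},
).json()

for acct in accounts["data"]:
    print(acct["id"], acct["balance"])
```

The developer never touches the bank's internal systems; the API is the contract. That is also why the business model matters as much as the technology: exposing the endpoint is the easy half, while sustaining the developers who build and maintain applications against it is the hard half.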
. . . Nor has it ever really been. Government data has long been a part of strategic business analysis. Census data provides insights into local standards of living and household budgets, health needs, education levels, and other factors that influence buying patterns for all kinds of goods and services. The US Bureau of Labor Statistics and the International Labour Organization provide data on employment and the availability of skilled labor that helps inform decisions on where to locate manufacturing or other facilities. Data from the World Bank and the UN provides insights into global trends.
Moreover, the release of government data has itself spurred billion-dollar industries. Think weather data, released in the 1970s by the National Oceanic and Atmospheric Administration, which gave birth to the weather industry and services like AccuWeather, weather.com, wunderground, and newer offerings like ikitesurf.com’s “wind and where.” Data from the US Global Positioning System (GPS) was opened to civilian and commercial use in the 1980s and has given rise to thousands of location-based services. Think Foursquare, Yelp, and Where’s The Bus?