Spending time at the MDM/DG Summit in NYC this week demonstrated the wide spectrum of MDM implementations and stories out in the market. It certainly coincides with our upcoming MDM inquiry analysis, where:
An IT mindset has dominated the way organizations view and manage their data. Even as issues of quality and consistency rear their ugly heads, the solution has often been to turn to a tool and approach data governance in a project-oriented manner. Sustainability has been a challenge, often relegated to IT managing and updating data management tools (MDM, data quality, metadata management, information lifecycle management, and security). Forrester research has shown that less than 15% of organizations have business-led data governance that is linked to business initiatives, objectives, and outcomes. But this is changing. More and more organizations are looking toward data governance as a strategic enterprise competency as they adopt a data-driven culture.
This shift from project to strategic program requires more than basic workflow, collaboration, and data profiling capabilities to institutionalize data governance policies and rules. The conversation can't start with the data management technology (MDM, data quality, information lifecycle management, security, and metadata management) that will apply those policies and rules. It has to begin with what the organization is trying to achieve with its data; this is a strategy discussion and process. The implication: governing data requires a rethink of your operating model. New roles, responsibilities, and processes emerge.
Coming back from the SAS Industry Analyst Event left me with one big question: are we taking into account the recommendations or insights provided through analysis and checking whether they actually produced positive or negative results?
It's a big question for data governance that I'm not hearing discussed around the table. We often emphasize how data is supplied, but how it performs in its consumed state is forgotten.
When leading business intelligence and analytics teams, I always pushed to create reports and analysis that ultimately incented action. What you know should influence behavior and decisions, even if the influence was to say, "Don't change, keep up the good work!" This should be a fundamental function of data governance. We need to care not only that the data is in the right form factor but also to review what the data tells us, how we interpret it, and whether it made us better.
I've talked about the closed loop from a master data management perspective: what you learn about customers will alter and enrich the customer master. The connection to data governance is pretty clear in this case. However, we shouldn't stop at raw data and master definitions. Our attention needs to include the data business users receive and whether it is trusted and accurate. This goes back to the fact that how the business defines data is more than what exists in a database or application. Data is a total, a percentage, an index. This derived data is what the business expects to govern, and if derived data isn't supporting business objectives, that has to be incorporated into the data governance discussion.
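One way to make governing derived data concrete is a reconciliation check. Here is a minimal sketch (the metric, figures, and tolerance are illustrative assumptions, not from any specific client) that verifies a reported percentage still agrees with the raw figures it was derived from:

```python
# Sketch: verify a derived metric (a percentage) against its raw inputs.
# The "active customer rate" metric and 0.5-point tolerance are
# illustrative assumptions for this example.

def reconcile_rate(reported_pct, numerator, denominator, tolerance=0.5):
    """Return True if the reported percentage matches the raw data."""
    derived_pct = 100.0 * numerator / denominator
    return abs(derived_pct - reported_pct) <= tolerance

# Raw data says 4,870 of 10,000 customers are active.
print(reconcile_rate(48.7, 4870, 10000))  # consistent with the dashboard
print(reconcile_rate(52.0, 4870, 10000))  # drifted: flag for governance review
```

A check like this turns "is the derived number trustworthy?" into something data governance can monitor and trend, just like raw-data quality rules.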
The last Forrester Wave for MDM was released in 2008 and focused on the Customer Hub. Well, things have certainly changed since then. Organizations need enterprise scale to break down data silos. Data governance is quickly becoming part of organizations' operating models. And don't forget the big elephant in the room: Big Data.
From 2008 to now there have been multiple analyst firm evaluations of MDM vendors. Vendors come, go, or are acquired, but the leaders are almost always the same. We also see inquiries and implementations tracking to the leaders. Our market overview report helped identify the distinct segments of MDM vendors and found that MDM leaders were going big: leveraging a strategic perspective on data management and a suite of products, and pushing to support and create modern data management environments. What still needed to be addressed: how do you make a decision between these vendors?
The Forrester Wave for the Multi-Platform MDM market segment gets to the heart of this question by pushing top vendors to differentiate themselves and evaluating them at the highest levels of MDM strategy. We learned things that surprised us, and we saw where the line was drawn between marketing messaging and positioning on one side and real capabilities on the other. We did this by structuring the Wave process the way our clients would evaluate vendors: rigorously questioning and fact-checking responses and demos.
Outside of Tucson is a place called Sahuarita, Arizona. Sahuarita is the home of Air Force Silo #571-7, where a Titan missile with a nine-megaton warhead, part of the US strategic deterrent, stood at the ready for 25 years should the United States need to retaliate against a Soviet nuclear attack. This missile could create a fireball two miles wide, contaminate everything within 900 square miles, and hit its target in 35 minutes; nothing in the current US nuclear arsenal comes close to its power. What kept it secure for 25 years? You guessed it: four phones, two doors, a scrap of paper, and a lighter.
Photo Credit: Renee Murphy
Technology has grown by leaps and bounds since the Cold War. When these silos went into service, they were manned by crews supplied by the Air Force. These men and women were responsible for ensuring the security and availability of the missile. Because there was no voice recognition, retinal scanning, biometric readers, or hard or soft tokens, the controls in place were almost entirely physical. All of the technology that we think of as keeping our data and data centers secure hadn't been developed yet. It is important to note that there was never a breach. Ever.
It might be an occupational hazard, but I can relate almost anything to security and risk management, and my visit to the Titan Missile Museum at AF Silo #571-7 was no exception. The lesson I took from my visit: there's room for manual controls in security and risk management.
I had the opportunity to speak and participate in a panel on data governance as it pertains to big data. My presentation was based on recently completed research, sponsored by IBM, to understand what data governance looks like at firms embarking on or executing big data initiatives. The overarching theme was that data governance is about protect and serve: manage security and privacy while delivering trusted data.
Yet, when you look at data governance and what it means to the data practice, not the technology, protect and serve is also a credo. In business terms it represents:
- Protect the reputation of the organization and mitigate the risk associated with inappropriate use or dirty data.
- Serve the information needs of the business so it gets information fast and stays agile to market conditions.
There are multiple maturity models and associated assessments for data governance on the market. Some are from software vendors or consulting companies, which use them as the basis for selling services. Others are from professional groups, like the one from the Data Governance Council.
They are all good, but frankly not adequate for the data economy many companies are entering. I think it is useful to reshuffle some too-well-established ideas...
Maturity models in general are attractive because:
- Using a maturity model is nearly a 'no-brainer' exercise. You run an assessment and determine your current maturity level. Then you can make a list of the actions that will drive you to the next level. You do not need to ask your business for advice, nor involve too many people in interviews.
- Most data governance maturity models are modeled on the well-known CMMI. That means they are similar, at least in terms of structure and levels, so the debate over the advantages of one vs. another is limited to its level of detail.
But as firms move into the data economy, with all that this means for how they source, analyze, and leverage data, I think that today's maturity models for data governance are becoming less relevant, and even an impediment:
As an analyst on Forrester's Customer Insights team, I spend a lot of time counseling clients on best-practice customer data usage strategies. And if there's one thing I've learned, it's that there is no such thing as a 360-degree view of the customer.
Here's the cold, hard truth: you can't possibly expect to know your customer, no matter how much data you have, if all of that data 1) is about her transactions with YOU and 2) is hoarded away from your partners. And this isn't just about customer data either -- it's about product data, operational data, and even cultural-environmental data. As our customers become more sophisticated and collaborative with each other ("perpetually connected"), organizations must do the same. That means sharing data, creating collaborative insight, and becoming willing participants in open data marketplaces.
Now, why should you care? Isn't it kind of risky to share your hard-won data? And isn't the data you have enough to delight your customers today? Sure, it might be. But I'd put money on the fact that it won't be for long, because digital disruptors are out there shaking up the foundations of insight and analytics, customer experience, and process improvement in big ways. Let me give you a couple of examples:
I met with a group of clients recently on the evolution of data management and big data. One retailer asked, “Are you seeing the business going to external sources to do Big Data?”
My first reaction was, “NO!” Yet, as I thought about it more and went back to my own roots as an analyst, the answer is most likely, “YES!”
Ignoring nomenclature, the reality is that the business is not only going to external sources for big data but has been doing so for years. Think about it: organizations that have considered data a strategic tool have invested heavily in big data going back to when mainframes came into vogue. More recently, banking, retail, consumer packaged goods, and logistics have produced marquee case studies on what sophisticated data use can do.
Before Hadoop, before massive parallel processing, where did the business turn? Many have had relationships with market research organizations, consultancies, and agencies to get them the sophisticated analysis that they need.
Think, too, about the fact that at the beginning of social media, it was PR agencies that developed the first big data analysis and visualization of Twitter, LinkedIn, and Facebook influence. In a past life, I worked at ComScore Networks, an aggregator and market research firm analyzing and trending online behavior. When I joined, it had the largest and fastest-growing private cloud collecting web traffic globally. Now, that was big data.
Today, the data paints a split picture. Across various surveys of IT, social media and online analysis make up a small percentage of the business intelligence and analytics that are supported. However, when we look to the marketing and strategy clients at Forrester, we see a completely opposite picture.
I recently had a client ask about MDM measurement for their customer master. In many cases, the discussions I have about measurement are about how to show that MDM has "solved world hunger" for the organization. In fact, a lot of the research and content out there focuses on just that. That's great for creating a business case for investment, but not so good for helping with the daily management of master data and data governance. This client's question was more practical, touching upon:
- What about the data do you measure?
- How do you calculate it?
- How frequently do you report and show trends?
- How do you link the calculation to something the business understands?
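As a sketch of what the practical side of "what do you measure and how do you calculate it" can look like, here are two common master data quality metrics, completeness and duplicate rate, computed over a toy customer master. The field names, sample records, and matching rule are invented for illustration, not a prescribed standard:

```python
# Sketch: basic data quality metrics for a customer master.
# Field names (name, email) and the duplicate-matching rule are
# illustrative assumptions.

def completeness(records, field):
    """Share of records with a non-empty value for `field`."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field))
    return filled / len(records)

def duplicate_rate(records, key_fields):
    """Share of records whose key fields collide with an earlier record."""
    if not records:
        return 0.0
    seen, dupes = set(), 0
    for r in records:
        key = tuple((r.get(f) or "").strip().lower() for f in key_fields)
        if key in seen:
            dupes += 1
        seen.add(key)
    return dupes / len(records)

customers = [
    {"name": "Ada Lovelace", "email": "ada@example.com"},
    {"name": "Ada Lovelace", "email": "ada@example.com"},  # duplicate
    {"name": "Alan Turing", "email": ""},                   # missing email
]

print(f"email completeness: {completeness(customers, 'email'):.0%}")
print(f"duplicate rate:     {duplicate_rate(customers, ['name', 'email']):.0%}")
```

Trending numbers like these monthly against a business-facing target (say, "marketing can reach 95% of customers by email") is one way to link the calculation to something the business understands.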