Happy New Year! As we kick off 2012, I’d like to reflect on what was accomplished during the past year in the “trusted data” areas of master data management (MDM), data quality (DQ), and data governance and consider what we might expect in the year to come. I also hear quite a bit of noise from vendors and analysts alike about what they want the MDM market to be in 2012, so I wanted to share my thoughts on what’s real and what (in my opinion) remains hype.
I also just completed Forrester’s December 2011 Global MDM Survey of 274 MDM professionals. While the majority of those results will be shared in the annual MDM Trends research that I’ll be publishing later in Q1, here’s a taste of some of the intriguing results.
Let’s first reflect on what I’ve witnessed from my clients’ MDM journeys throughout 2011:
Data governance remained a challenge. In the abovementioned MDM survey, only 20% responded that they have a high or very high level of data governance maturity, indicating that significant work remains. But on the positive side, I'm witnessing increasing business sponsorship and prioritization, which has helped many organizations launch pilot programs to cut their teeth and build repeatable processes, foundational policies, and early measurements that make the case for increased data governance investment and momentum.
Multidomain MDM hit its stride. User interest in multidomain MDM strategies has finally caught up with vendors' product capabilities and messaging. In Forrester's MDM Survey, 47% responded that the scope of their MDM programs includes more than two data domains to master, while another 9% are focused on dual-domain solutions (e.g., customer and product).
In October, my colleague Brian Hopkins published Forrester's extremely popular enterprise architecture (EA) trends research, the Top 10 Business Technology Trends EA Should Watch: 2012 to 2014. As in past iterations of this research, master data management (MDM), along with the data governance capabilities required to support it, remains among the top technology strategies that enterprise architects expect to deliver the most business value to their firms, as well as to require the most change to their firms' technology landscapes over the next three years.
In 2009, the anticipated trend around MDM was that it would mature significantly, not only from a technology and architecture standpoint but also in the skills, best practices, and methodologies used to effectively deliver MDM capabilities. I believe that the maturity of MDM practitioners has in fact increased significantly over the past two years. In August, I published a report titled Master Data Management: Customer Maturity Takes A Great Leap Forward, which analyzed about 175 MDM- and data governance-related client inquiries Forrester received between January 2010 and June 2011. The crux of that research was to demonstrate that our clients are asking much more practical and insightful questions about MDM architectures, best practices, strategies, governance, and vendor selection than in years past.
Brian’s EA trends report identifies two specific trends around data management and governance:
Most master data management (MDM), data quality, and accompanying data governance efforts prioritize customer, account, and product data over all others. Certainly, industry-specific exceptions exist; for example, energy, utility, and oil and gas companies place a high priority on asset and location data domains, while investment management firms prioritize securities. But exceptions aside, a recent Forrester survey of 298 business process management (BPM) and MDM professionals across industries found that 83% prioritized customer data, 61% product data, and 53% account data. And coming in at 44%, the next highest priority: the red-headed stepchild of the MDM “party” (pun intended — apologies for that), employee data!
It’s no surprise that customer/account and product data-centric MDM programs get the lion’s share of funding, executive sponsorship, and prioritization within most organizations. This data is the lifeblood of your customer engagement and supply/distribution chain, with quantifiable impacts to both top- and bottom-line success, and can be positioned as a major competitive differentiator. But even more relevant, those MDM efforts are often driven by sales, marketing, finance, operations, or risk management functional organizations — all of which are typically better funded than many human resource (HR) teams, especially when it comes to IT budgeting. Of course, this isn’t always the case, and many large enterprises spend millions of dollars optimizing their HR systems infrastructure. Applications supporting learning management, performance and talent management, recruiting, time and attendance, benefits administration, compensation planning and analysis, and organizational charting and employee directories all require high-quality employee and organizational data.
Data management and BI professionals often feel pressure from senior management to propose and start implementing master data management (MDM), data quality, data warehousing, business intelligence (BI), analytics, or other data management strategies quickly, without time to perform the necessary due diligence. These "fire drill" strategy sessions may arise as a reaction to a compelling event like a compliance or regulatory action, to the need to support better management planning and decision-making during economic struggles, or even to the arrival of a new senior executive (e.g., CEO, CIO, CFO, COO, CMO) looking to make their mark on the organization by driving the strategy.
Unfortunately, the program drivers on the hook to deliver these catch-up strategy planning initiatives tend to disregard many best practices in the process. Can you blame them? Many of them have been the organizational evangelists who have fought for months – or even years – to get sponsorship and investment to deliver these solutions. When that support finally arrives, they'd be crazy to turn it away just because the timelines are a bit aggressive, right? Well, yes, they should push back if the solution they're building will not:
Deliver clear business value and ROI, with a line of sight to how the capabilities will improve efficiencies, reduce cost, reduce risk, increase revenue, or strategically differentiate your organization. Think that executive sponsor will have your back if you can't prove the value? Think again.
Scale and offer the flexibility and agility to support the next set of incremental requirements or users that will inevitably come along.
Guarantee end user adoption and acceptance of the new solution that will likely introduce new processes, technologies, and/or organizational changes.
I was recently chatting with Jim Harris, the well-respected blogger-in-chief of the Obsessive-Compulsive Data Quality blog, about one of our favorite topics: data governance best practices. Our conversation migrated to one of data governance’s biggest challenges: how to balance bureaucracy and business agility.
So Jim and I thought it would be fun to tackle this dilemma in a Star Wars-themed debate across each of our individual blog platforms, with Jim taking the position for “Agility” as the Rebellion and me taking the opposing position for “Bureaucracy” as the Empire.
Note: Yes, most conversations between self-proclaimed data geeks tend to result in Star Wars or Star Trek parallels . . . and I lost the coin toss. Thankfully, I found StarWars.com to help me with some of my rusty Star Wars facts!
Disclaimer: Remember, this is meant to be a true debate format, where Jim and I are intentionally arguing polar opposite positions with full knowledge of the reality that data governance success requires effectively balancing bureaucracy and agility.
Please take the time to read both of our blog posts, then we encourage your comments — and your votes (see the poll below).
I recently received a client inquiry, flavors of which I see a few times per quarter. The client said that they are trying to explore ways to establish the value of information within their enterprise. If people so often frame data and information as an asset, shouldn't we be able to establish its value?
What I share with my clients is that trying to place a monetary value on data and information itself is a red herring, an effort I highly recommend avoiding, unless you enjoy philosophical exercises that don't translate to actual business value. (Apologies to those who fit in that camp; have fun!)
The “data is an asset” rhetoric doesn’t translate to putting a monetary value on a customer record, as an example, because data in and of itself has no value! The only value data/information has to offer — and the reason I do still consider it an “asset” at all — is in the context of the business processes, decisions, customer experiences, and competitive differentiators it can enable.
For example, a customer record doesn’t have value unless you can sell, market, or service that customer. So for each customer record, many customer intelligence analysts calculate lifetime value scores, the potential share of wallet available, the customer’s propensity to buy certain products and services, and even the cost of servicing the customer. But that doesn’t put a value on the customer record itself: It places the value based on the sales, marketing, and service processes the data supports. And that’s where the data value should live: in the consuming processes.
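To make that principle concrete, here is a minimal sketch in Python. It is purely illustrative and not a Forrester methodology: the field names, the simple discounted lifetime-value formula, and the numbers are all hypothetical. The point it demonstrates is structural, namely that value is computed by the consuming processes (marketing, sales, service) and merely attributed back to the record, never stored as a property of the record itself.

```python
# Hypothetical sketch: a customer record has no intrinsic value; value is
# attributed to it only through the processes that consume it.

def lifetime_value(annual_margin, retention_rate, discount_rate, years=5):
    """Simple discounted customer lifetime value over a fixed horizon.
    Each year's expected margin is weighted by the chance the customer
    is still retained, then discounted back to today."""
    value = 0.0
    for t in range(1, years + 1):
        survival = retention_rate ** t                  # retention through year t
        value += (annual_margin * survival) / ((1 + discount_rate) ** t)
    return value

def record_value(customer, processes):
    """Attribute value to a record as the sum of what each consuming
    business process can extract from (or must spend on) it."""
    return sum(process(customer) for process in processes)

# Illustrative record and consuming processes (all figures invented).
customer = {"annual_margin": 400.0, "retention_rate": 0.8, "service_cost": 50.0}

processes = [
    lambda c: lifetime_value(c["annual_margin"], c["retention_rate"], 0.10),
    lambda c: -c["service_cost"],                       # cost-to-serve reduces net value
]

print(round(record_value(customer, processes), 2))
```

Note that if the `processes` list is empty, `record_value` returns zero, which is exactly the blog's argument: the same record, consumed by nothing, is worth nothing.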
March 6, 2011 marked my five-year anniversary with Forrester Research, and as an industry analyst. Back in 2006, I made the pivotal career decision to depart the end user world, where I had spent almost 15 years across multiple organizations fighting the good fight, building data management solutions that delivered trusted data to the people, processes, and systems that needed it. It amazes me that I graduated college as a finance major with no IT experience; at the time, I wouldn't have known a database if it crashed right in front of me.
For my first job out of school, I worked in Newark, NJ, for six years with Thomson Financial Services (TFS) on its Global Mergers & Acquisitions database product, eventually taking on the role of research manager for that product. At the time I foolishly thought I was gaining M&A expertise, but in reality I was learning how to ensure that the data about these M&A transactions was of the highest quality, because data at TFS was not just an asset; it was the product. To accomplish this, I had to become very proficient with this random programming language called SQL, which I assumed was some niche thing most likely irrelevant beyond TFS (remember: finance major!). Those were still the days of green-screen dumb terminals, before the PC and GUI revolution, and learning SQL was an incredible eye-opener.
For the second year in a row, Forrester Research has targeted master data management (MDM) as one of the highest-impact technologies that enterprise architects must keep an eye on. Forrester Vice President and Principal Analyst Gene Leganza published “The Top 15 Technology Trends EA Should Watch: 2011 To 2013” research in October, and Gene smartly positions MDM along with next-gen business intelligence, advanced text and social analytics, and information-as-a-service integration architectures as key enablers to deliver what Forrester is calling “process-centric data and intelligence”.
Data governance is not – and should never have been – about the data. High-quality and trustworthy data sitting in some repository somewhere does not in fact increase revenue, reduce risk, improve operational efficiencies, or strategically differentiate any organization from its competitors. It’s only when this trusted data can be delivered and consumed within the most critical business processes and decisions that run your business that these business outcomes can become reality. So what is data governance all about? It’s all about business process, of course.
As a lifelong Yankees fan (which makes me a pariah with many of my Red Sox Country-based Forrester coworkers in Cambridge, Mass.), I’ve been following with amusement the sports media frenzy around the New York Yankees' "not-so-public yet not-so-private" contract negotiations with their star shortstop, Derek Jeter. While I read these news snippets with the intent of escaping the exciting world of data management for just a brief moment, I couldn’t escape for long because both sides of the table bring up reams of data to defend their positions.
According to the media reports and analysis, the Yankees' ownership is seemingly paying less attention to Jeter's Hall of Fame-worthy career statistics, including a fantastic 2009 season, and to his intrinsic value to the Yankees brand, focusing instead on Jeter's arguably career-low 2010 on-field performance and his advancing age (36 years old is practically Medicare-eligible in baseball).
Jeter's side of the negotiations, on the other hand, points out that Jeter's value to the Yankees is "immeasurable" and that one off year shouldn't be used to define his value to the team. His camp notes that Jeter, as team captain, is a major leader in the clubhouse and an excellent role model for younger players. He's certainly among the most popular players the Yankees employ, and he influences boatloads of fans to attend games, watch the Yankees cable network, and generate significant licensing revenue. And of course, they argue that Jeter is still an excellent player and that 2010 should be viewed as an anomaly, not the norm.
I’m not a baseball analyst (lucky for everyone), and I have no intention in joining the debate on whose point of view is correct or how much Jeter should earn, for how many years, etc. (That’s best discussed over a few beers, not on a blog, right?)
I’ve just published my comprehensive analysis of the enterprise data quality platforms market in my Forrester Wave™ report and provided a deep-dive assessment of the technologies, strategies, and market presence of the six vendors that, in my opinion, provide the most mature, robust, and comprehensive data quality and data profiling solutions available. The evaluated vendors include DataFlux, IBM, Informatica, Harte-Hanks Trillium Software, Pitney Bowes Business Insight, and SAP BusinessObjects.
DQ Market Overview
A fair question some may ask when they see this research is: "Why evaluate only six vendors?" The reason is a simple cost/benefit analysis. As any vendor that has participated in this process can confirm, the Forrester Wave™ research methodology is, at minimum, a six-plus-month, in-depth product and vendor strategy evaluation, and each additional vendor only adds to the effort. So when determining my inclusion criteria, I had to ask, as always, who the target audience for this research is. The answer, of course, is the data management professionals who work for Forrester's clients, who for the most part are large commercial enterprises and public sector organizations that often support global operations.