While data governance has been a business need for years, it is now moving to center stage as a business concern. Driving this shift are new regulations and new requirements addressing consumer data ownership, privacy, and business data monetization. Two of the most important regulations are the European General Data Protection Regulation (GDPR) and the Basel Committee on Banking Supervision regulation 239 (BCBS 239). Forrester recognized this change three years ago when we described the evolution of data governance away from “data input quality” toward “data usage,” which we call data governance 2.0. Some emerging data governance solution vendors, like Collibra and GDE, have moved aggressively to address the new requirements of data governance 2.0. However, larger established vendors like IBM, Informatica, SAS, and SAP have moved more slowly, instead prioritizing investments in developing a platform supporting systems of insight.
Two recently announced acquisitions demonstrate that the larger established vendors now recognize the need for renewed data governance offerings:
Informatica’s purchase of the Diaku Axon platform. Announced on February 22, the acquisition of the Diaku Axon platform adds business-oriented capabilities like vertical knowledge (finance) and support of regulations such as GDPR and BCBS 239 to Informatica’s current data governance execution capabilities (DQ, MDM, security/masking).
Last week, I participated in a roundtable during a conference in Paris organized by the French branch of DAMA, the data management international organization. During the question/answer part of the conference, it became clear that most of the audience was confusing data management with data governance (DG). This is a challenge my Forrester colleague Michele Goetz identified early in the DG tooling space. Because data quality and master data management embed governance features, many view them as data governance tooling. But the reality is that they remain data management tooling — their goal is to improve data quality by executing rules. This tooling confusion is only a consequence of how much the word governance is misused and misunderstood, and that leads to struggling data governance efforts.
So what is “governance”? Governance is the collaboration, organization, and metrics facilitating a decision path between at least two conflicting objectives. Governance is finding the acceptable balance between the interests of two parties. For example, IT governance is needed when you would like to support all possible business projects but have limited budget, skills, or resources available. Governance is needed when objectives differ across stakeholders, and the outcome of governance is that those objectives do not get the same priority. If everyone has the same objective, then you are doing data management, not data governance.
There’s a renewed interest in integration technologies due to new needs for integration to mobile, the Internet of Things (IoT), and cloud — but also because integration requirements between systems of engagement and systems of record are changing: real-time responsiveness for seamless omnichannel boundaries, higher volumes, and end-to-end security all highlight the shifts in integration practices. Forrester will soon publish a report about the integration trends around these subjects.
I am happy to pick up this subject again from Stefan Ried after being away from the space for the past six years. Stefan left Forrester in December and I regret his departure, because he was a very passionate analyst and a smart guy to work with.
There are multiple maturity models and associated assessments for data governance on the market. Some are from software vendors or consulting companies, which use them as the basis for selling services. Others are from professional groups, like the one from the Data Governance Council.
They are all good — but frankly not adequate for the data economy many companies are entering. I think it is useful to reshuffle some too-well-established ideas.
Maturity models in general are attractive because:
Using a maturity model is nearly a “no-brainer” exercise. You run an assessment and determine your current maturity level. Then you can make a list of the actions that will drive you to the next level. You do not need to ask your business for advice or interview many people.
Most data governance maturity models are modeled on the well-known CMMI. That means they are similar, at least in terms of structure and levels, so the debate over the advantages of one versus another comes down to their level of detail.
There are interesting debates all around the globe about whether there is a need for a next-gen EA framework. James Lapalme recently published an excellent article, Three Schools of Thought on Enterprise Architecture, explaining the reasons for these debates.
In this article, James identifies three schools of thought for EA, each with its own scope and purpose:
"Enterprise IT architecting," which addresses enterprisewide IT and the alignment of IT with business.
"Enterprise integrating," which addresses the coherency of the enterprise as a system in which IT is only one component.
"Enterprise ecological adaptation," which addresses the enterprise in its larger environment.
Enterprise architects I talk with are struggling with the pace of change in their business.
We all know the pace of change in business, and in the technology that shapes and supports our business, is accelerating. Customers expect more ethical behavior from companies and more personalized services, yet they do not want to share private information. Technology is leveling the playing field between established firms and new competitors. The economic, social, and regulatory environment is becoming more complex.
What this means for enterprise architects is that the founding assumptions of EA — a stable, unified business strategy, a structured process for planning through execution, and a compelling rationale for EA’s target states and standards — don’t apply anymore. Some of the comments I hear:
“We’re struggling with getting new business initiatives to follow the road maps we’ve developed.”
“By the time we go through our architecture development method, things have changed and our deliverables aren’t relevant anymore.”
“We are dealing with so many changes which are not synchronized that we are forced to delay some of the most strategic initiatives and associated opportunities.”
The bottom line is that the EA methods available today don’t handle the continuous, pervasive, disruption-driven business change that is increasingly the norm in the digital business era. Our businesses need agility — our methods aren’t agile enough to keep up.
IT has too many separate portfolios to manage, and that hinders its ability to help the business change. We have project portfolios, application portfolios, technology portfolios, and IT service portfolios — each managed in silos. These portfolios are all IT-centric; they generally mean nothing to business leaders. The business has products, customers, partners, and processes — and the connection between these business portfolios and the IT portfolios is rarely apparent and usually not even documented. Change in the business — in any of these areas — is connected to IT only in the requirements document of a siloed project. Lots of requirements documents for lots of siloed projects lead to more complexity and less ability to support business change.
How do we connect these business concepts to IT? What's the "unit" that connects IT projects, apps, and technology with business processes and products?
It's not "business capabilities" — they are an abstraction most useful for prioritization, analysis, and planning. We need a term to manage the day-to-day adaptation and implementation of these capabilities — the implementation with all its messiness, such as fragmented processes and redundant apps — that we can use to manage any type of change.
We believe the best term for this unit is "business services," with this definition:
The output of a business capability with links to the implementation of people, processes, information, and technology necessary to provide that output.
I say "finally" because most of the ideas for these documents were collected during the research Diego Lo Giudice and I did for Forrester's EA Forum 2010, nearly one year ago. Although ideas come quickly, they sometimes take a long time to be realized in a document! I apologize to the customers who were waiting for the final document.
The goal of this collection of documents is to demonstrate typical EA involvement in IT governance — an area that is usually more or less "beyond" EA's scope. We also said in the EA Forum presentation that these potential involvements are not mandatory and depend heavily on your particular EA objectives. EA involvement in IT governance should remain in line with the recommendation we made in the Forrester report "Avoid The EA Governance Versus Agility Trap," which we still stand by: Governance is a lever for reconciling nonshared (or even diverging) objectives. When objectives are shared, governance is not required, and the approach should remain agile.
EA teams like to know how mature their EA practice is. There are a lot of EA maturity models out there. You will find some of these assessments and maturity models discussed in a 2009 Forrester report. Many EA teams share the idea that there is a single “ultimate EA model” and that EA leaders should strive to move up the ladder to this ultimate model. It’s like a video game – you try to get to the next level.
For the past three months, the EA team's researcher Tim DeGennaro has been looking at these models and Forrester's research on EA best practices to create a framework for assessing EA programs. This looked deceptively simple: Develop criteria based on the best practices we see in leading EA organizations, create an objective scale to rate an organization's progress, offer reporting to illuminate next steps, and wrap it all in an easy-to-use assessment package. What we've found so far is not only that avoiding the effects of subjectivity and lack of context is impossible, but also that many assessment styles disagree on the most crucial question: What exactly is EA supposed to be aiming for?