The Outcome Focus Of World Class EA Programs

Alex Cullen

I recently had a conversation with a new EA practice leader in the investment management business unit of a large multi-line insurance company. They wanted to hear my perspectives on what a world-class EA program should look like. They knew all the traditional EA building blocks: standards and roadmaps, architecture domains, methodologies like TOGAF. They had a long list of things to do, but were uncertain about which to tackle first, and had a nagging feeling that these had little to do with world-class EA programs. We touched on EA maturity models, but quickly concluded that there isn't an obvious and compelling business value proposition to simply 'being mature'.

The conversation shifted to outcomes: what are the outcomes of a world-class EA program? IT cost reduction could be an outcome, and has been the raison d'être of EA for years. IT solution design quality could be an outcome, and has been the justification for architects for longer than EA has been around. But these are all IT-centric outcomes.

We all know the world is changing. Digital capabilities are radically impacting our customers, the competitive landscape, the regulatory context, and the operating models of businesses. Kyle McNabb summarizes this very well in his blog post. The mantra today is business agility in the face of all these radical changes. Because of this, being IT-centric is no longer the hallmark of a world-class EA program.

Read more

Don't Establish Data Management Standards

Michele Goetz

A recent survey of Enterprise Architects showed a lack of standards for data management.* Best practice has always been about creating standards for IT, which would lead us to think that a lack of standards for data management is a gap.

Not so fast.

Standards can help control cost. Standards can help reduce complexity. But in an age when a data management architecture needs to flex and meet the business need for agility, standards are a barrier. The emphasis on standards is what keeps IT in a mode of constant foundation building, playing the role of the deli counter, and focusing on cost management.

In contrast, when companies throw off the straitjacket of data management standards, they are no longer challenged by the foundation. These organizations are challenged by ceilings. Top-performing organizations, those that have had annual growth above 15%,** are working to keep the dam open, letting more data in and managing more variety. They are pushing the envelope on the technology that is available.

Think about this: overall, organizations have made similar data management technology purchases. What has separated top performers from the rest is not being constrained. Top performers maximize and master the technology they invest in. They are now better positioned to do more, expand their architecture, and ultimately grow data value. For big data, they have stepped, or are getting ready to step, out of the sandbox. Other organizations have not seen enough value to invest more. They are in the sand trap.

Standards can help structure decisions and strategy, but they should never be barriers to innovation.

*203 Enterprise Architecture Professionals, State of Enterprise Architecture Global Survey, 2012

**Top performer organization analysis based on data from Forrsights Strategy Spotlight BI And Big Data, Q4 2012

What Master Data Management Metrics Matter?

Michele Goetz

I recently had a client ask about MDM measurement for their customer master. In many cases, the discussions I have about measurement are about how to show that MDM has "solved world hunger" for the organization. In fact, a lot of the research and content out there focuses on just that. Great for creating a business case for investment; not so good for helping with the daily management of master data and data governance. This client question is more practical, touching on the points below (a minimal sketch after the list shows one way to put them into practice):

  • What about the data do you measure?
  • How do you calculate it?
  • How frequently do you report and show trends?
  • How do you link the calculation to something the business understands?
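
As a minimal sketch, not a prescribed method, the Python below shows how a customer-master metric such as completeness or duplicate rate could be calculated, snapshotted on a reporting cadence, and tied to a number the business understands. The records and field names are hypothetical.

    from datetime import date

    # Hypothetical customer-master records; field names are illustrative.
    customers = [
        {"id": 1, "email": "ann@example.com", "postal_code": "02139"},
        {"id": 2, "email": None,              "postal_code": "10001"},
        {"id": 3, "email": "ann@example.com", "postal_code": None},
    ]

    def completeness(records, field):
        # Share of records with a non-empty value for the field.
        filled = sum(1 for r in records if r.get(field))
        return filled / len(records)

    def duplicate_rate(records, field):
        # Share of records whose value for the field appears more than once.
        values = [r.get(field) for r in records if r.get(field)]
        dupes = sum(1 for v in values if values.count(v) > 1)
        return dupes / len(records)

    # Snapshot on a regular cadence (weekly, monthly) and keep history so
    # data stewards can show trends, not just point-in-time scores.
    snapshot = {
        "as_of": date.today().isoformat(),
        "email_completeness": completeness(customers, "email"),
        "email_duplicate_rate": duplicate_rate(customers, "email"),
    }

    # Tie the number to something the business understands: every missing
    # email is a customer that marketing cannot reach.
    unreachable = round((1 - snapshot["email_completeness"]) * len(customers))
    print(snapshot, "unreachable customers:", unreachable)

The last step is the point of the fourth question: translating a completeness percentage into "customers we cannot reach" is what links the calculation to something the business understands.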
Read more

Chief Data Officers Are A Good Idea -- But How Is That Going To Work?

Gene Leganza

It seems to be popular these days amongst industry pundits to recommend that organizations add a new C-level role: the Chief Data Officer (CDO). The arguments in favor of this move are exactly what you'd think: the rapidly accelerating importance of information in the enterprise and, as important, the heightened perception of the importance of information by business executives. The attention on information comes from all the rich new data that simply didn't exist before: sensor data from the Internet Of Things, social media, process data -- really just the enormous volume of data resulting from the digitization of everything. Add to all that new technology to handle big data in a reasonable time frame, user-friendly mobile computing in the form of tablets, data virtualization software and data warehouse appliances that significantly accelerate the process of getting at information for analysis, and the promise of predictive analytics, and there's plenty of cause for an information management renaissance out there. With a little luck, the activity it catalyzes will also improve enterprises' ability to manage the data and content that's not so new but also very important, which we've been struggling with for the last decade or so.

The only argument against creating this role that I've run across is that if CIOs and CTOs did their jobs right, we wouldn't need this new role. That's pretty feeble since we're not just talking about IT's history of relative ineffectiveness in managing information outside of application silos (and don't get me started about content management) -- we're adding to that a significant increase in the value of information and a significant increase in the amount of available information. And then there's the fact that the data could be in the cloud and not managed by IT, and there's also a changing picture regarding risk that suggests a new approach.

Read more

The Great Divide: MDM and Data Quality Solution Selection

Michele Goetz

I just came back from a Product Information Management (PIM) event this week and had a lot of discussions about how to evaluate vendors and their solutions. I also get a lot of inquiries on vendor selection, and while a lot of the questions center on the functionality itself, how to evaluate is also a key point of discussion. What piqued my interest on this subject is that IT and the Business have very different objectives in selecting a solution for MDM, PIM, and data quality. In fact, it can often get contentious when IT and the Business don't agree on the best solution.

General steps to purchase a solution seem pretty consistent: create a short list based on the Forrester Wave and research, conduct an RFI, narrow down to two or three vendors for an RFP, and make a decision. But the devil seems to be in the details.

  • Is a proof of concept required?
  • How do you make a decision when vendors' solutions appear the same? Are they really the same?
  • How do you put pricing into context? Is lowest really best?
  • What do you need to know before engaging with vendors to identify fit and differentiation?
  • When does meeting business objectives win out over fit with IT skills and platform consistency? (One way to make that trade-off explicit is sketched below.)
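
As one illustrative way to handle that last question, the sketch below keeps business objectives and IT fit in a single weighted scorecard so the decision is explicit rather than contentious. The vendors, criteria, weights, and scores are all hypothetical, not a Forrester evaluation method.

    # Illustrative criteria and weights; adjust to the organization's own
    # priorities. Vendor names and scores are hypothetical.
    weights = {
        "meets_business_objectives": 0.40,
        "fit_with_it_skills":        0.25,
        "platform_consistency":      0.15,
        "total_cost":                0.20,
    }

    # Scores from the RFP responses and/or proof of concept, on a 1-5 scale.
    scores = {
        "Vendor A": {"meets_business_objectives": 5, "fit_with_it_skills": 3,
                     "platform_consistency": 3, "total_cost": 2},
        "Vendor B": {"meets_business_objectives": 3, "fit_with_it_skills": 5,
                     "platform_consistency": 5, "total_cost": 4},
    }

    def weighted_score(vendor_scores):
        # Sum of criterion score times criterion weight.
        return sum(weights[c] * s for c, s in vendor_scores.items())

    for vendor in sorted(scores, key=lambda v: weighted_score(scores[v]), reverse=True):
        print(vendor, round(weighted_score(scores[vendor]), 2))

The value of the exercise is less the final number than forcing IT and the Business to agree on the weights before the vendor scores come in.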
Read more

Forrester's Top 15 Emerging Technologies To Watch: Now To 2018

Brian Hopkins

The pace of technology-fueled business innovation is accelerating, and enterprise architects can take a leading role by helping their firms identify opportunities for shrewd investment. In our 2012 global state of EA online survey, we asked again what the most disruptive technologies would be; here's what we found.

The results shouldn’t surprise anybody; however, if you are only looking at these, you are likely to get smacked in the face when you blink -- things are changing that fast. In the near future, new platforms built on today’s hot technologies will create more disruption. For example, by 2016 there will be 760 million tablets in use and almost one-third will be sold to business. Forrester currently has a rich body of research on mobility and other hot technologies, such as Forrester’s mobile eBusiness playbook and the CIO’s mobile engagement playbook. But by 2018, mobile will be the norm, so then what?

Read more

Is There A Need For A Next-Gen EA Framework?

Henry Peyret

There are interesting debates all around the globe about whether there is a need for a next-gen EA framework. James Lapalme recently published an excellent article, "Three Schools of Thought on Enterprise Architecture," explaining the reasons behind these debates.

In this article James identifies three schools of thought for EA, each with its own scope and purpose:

  • "Enterprise IT architecting," which addresses enterprise-wide IT and the alignment of IT with the business.
  • "Enterprise integrating," which addresses the coherency of the enterprise as a system in which IT is only one component.
  • "Enterprise Ecological Adaptation," which addresses the enterprise in its larger environment.
Read more

5-Year Journey Of TOGAF In China Is Just A Beginning For EA

Charlie Dai

As businesses get larger, and the need for effective alignment of the business with technology capabilities grows, enterprise architecture becomes an essential competency. But in China, many CIOs are struggling with setting up a high-performance enterprise architecture program to support their business strategies in a disruptive market landscape. This seems equally true for state-owned enterprises (SOEs) and multinational companies (MNCs).

To gain a better understanding of the problem, I had an interesting conversation with Le Yao, general secretary of the Center for Informatization and Information Management (CIIM) and director of the CIO program at Peking University. Le Yao was one of the first pioneers to introduce The Open Group Architecture Framework (TOGAF) into China to help address these challenges. I believe that the five-year journey of TOGAF in China is just an early beginning for EA, and companies in the China market need relevant EA insights to help them support their business:

  • Taking an EA course is one thing; practicing EA is something else. Companies taking TOGAF courses in China seem to be aiming more at sales enablement than at practicing EA internally. MNCs like IBM, Accenture, and HP are more likely to try to infuse the essence of the methodology into their PowerPoint slides for marketing and/or bidding purposes; IBM has also invited channel partners such as Neusoft, Digital China, CS&S, and Asiainfo to take the training.
  • TOGAF is too high-level to be relevant. End user trainees learning the enterprise architecture framework that Yao’s team introduced in China in 2007 found it to be too high-level and conceptual. Also, the trainers only went through what was written in the textbook without using industry-specific cases or practice-related information — making the training less relevant and difficult to apply.
Read more

Judgement Day For Data Quality

Michele Goetz

Joining in on the spirit of all the 2013 predictions, it seems that we shouldn't leave data quality out of the mix. Data quality may not be as sexy as big data has been this past year. The technology is mature and reliable. The concept is easy to understand. It is also one of the few areas in data management that has a recognized and adopted framework to measure success. (Read Malcolm Chisholm's blog on data quality dimensions.) However, maturity shouldn't create complacency. Data quality still matters, a lot.

Yet judgement day is here, and data quality is at a crossroads. Its maturity in both technology and practice is steeped in an old way of thinking about and managing data. Data quality technology is firmly seated in the world of data warehousing and ETL. While that is still a significant portion of an enterprise data management landscape, the adoption and use of in-memory, Hadoop, data virtualization, streams, etc. in business-critical applications and processes means that more and more data is bypassing the traditional platform.

The options to manage data quality are expanding, but not necessarily in a way that ensures that data can be trusted or complies with data policies. Where data quality tools have provided value is in the ability to have a workbench to centrally monitor, create, and manage data quality processes and rules. They created sanity where ETL spaghetti created chaos and uncertainty. Today, this value proposition has diminished as data virtualization, Hadoop processes, and data appliances create and persist new data quality silos. Worse, these data quality silos often lack the monitoring and measurement needed to govern data. In the end, do we have data quality? Or are we back where we started?
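
As a minimal sketch of that central-workbench idea, the Python below keeps one registry of data quality rules and applies it to records from any platform, so a new Hadoop job or virtualized view doesn't become a new quality silo. The rule names, checks, and sources are hypothetical.

    # One registry of data quality rules; names and checks are hypothetical.
    rules = {
        "email_present":   lambda r: bool(r.get("email")),
        "amount_positive": lambda r: r.get("amount", 0) > 0,
    }

    def audit(source_name, records):
        # Run every registered rule against one source and return pass rates.
        report = {}
        for name, check in rules.items():
            passed = sum(1 for r in records if check(r))
            report[name] = passed / len(records) if records else 1.0
        return {"source": source_name, "pass_rates": report}

    # The same rules govern data whether it comes from the warehouse, a
    # Hadoop job, or a virtualized view; only the extraction step differs.
    warehouse_rows = [{"email": "a@example.com", "amount": 10.0}]
    hadoop_rows    = [{"email": None,            "amount": -1.0}]

    print(audit("warehouse", warehouse_rows))
    print(audit("hadoop_job", hadoop_rows))

The design point is that the rules live in one place and every platform reports against them, which restores the monitoring and measurement that the new silos lack.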

Read more