Information Fabric 3.0 Delivers The Next Generation Of Data Virtualization

Noel Yuhanna

For decades, firms have deployed applications and BI on independent databases and warehouses, supporting custom data models, scalability, and performance while speeding delivery. Integrating the data that has proliferated across these sources to deliver the unified view of business data required to support new business applications, analytics, and real-time insights has become a nightmare. The explosion of new sources, driven by the triple-threat trends of mobile, social, and the cloud and amplified by partner data, market feeds, and machine-generated data, further aggravates the problem. Poorly integrated business data often leads to poor business decisions, reduces customer satisfaction and competitive advantage, and slows product innovation, ultimately limiting revenue.

Forrester’s latest research reveals how leading firms are coping with this explosion using data virtualization, leading us to release a major new version of our reference architecture, Information Fabric 3.0. Since Forrester invented the category of data virtualization eight years ago with the first version of information fabric, these solutions have continued to evolve. In this update, we reflect new business requirements and new technology options, including big data, cloud, mobile, distributed in-memory caching, and dynamic services. Use information fabric 3.0 to inform and guide your data virtualization and integration strategy, especially where you require real-time data sharing, complex business transactions, more self-service access to data, integration of all types of data, and increased support for analytics and predictive analytics.
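
The report itself describes the fabric architecture in depth; purely as an illustration of the underlying idea, the minimal, hypothetical Python sketch below builds a “virtual view” that federates two live sources at query time instead of copying their data into a warehouse. All sources, tables, and field names here are invented for the example and are not taken from the information fabric reference architecture.

    import sqlite3

    # Illustrative only: a "virtual view" in the spirit of data
    # virtualization. It federates two live sources at query time
    # rather than physically replicating data into a warehouse.
    # All sources, tables, and fields below are invented.

    # Source 1: an operational CRM database (simulated with SQLite).
    crm = sqlite3.connect(":memory:")
    crm.execute("CREATE TABLE customers (id INTEGER, name TEXT, region TEXT)")
    crm.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                    [(1, "Acme Corp", "EMEA"), (2, "Globex", "APAC")])

    # Source 2: a partner or cloud feed (simulated with a list of dicts;
    # in a real fabric this might be a REST API, file, or message stream).
    orders_feed = [
        {"customer_id": 1, "amount": 1200.0},
        {"customer_id": 1, "amount": 450.0},
        {"customer_id": 2, "amount": 980.0},
    ]

    def customer_revenue_view():
        """Join both sources on demand; nothing is persisted or replicated."""
        revenue = {}
        for order in orders_feed:  # pull the live feed at query time
            cid = order["customer_id"]
            revenue[cid] = revenue.get(cid, 0.0) + order["amount"]
        return [
            {"name": name, "region": region, "revenue": revenue.get(cid, 0.0)}
            for cid, name, region in crm.execute(
                "SELECT id, name, region FROM customers")
        ]

    print(customer_revenue_view())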

Information fabric 3.0 reflects significant innovation in data virtualization solutions, including:

Read more

Q&A With Greg Swimer, VP IT, Business Intelligence, Unilever

Kyle McNabb

In advance of next week’s Forrester European Business Technology Forums in London on June 10 and 11, we had an opportunity to speak with Greg Swimer about information management and how Unilever delivers real-time data to its employees. Greg Swimer is a global IT leader at Unilever, responsible for delivering new information management, business intelligence, reporting, consolidation, analytics, and master data solutions to more than 20,000 users across all of Unilever’s businesses globally.

1) What are the two forces you and the Unilever team are balancing with your “Data At Your Fingertips” vision?

Putting the data at Unilever’s fingertips means working on two complementary aspects of information management. One aspect is to build an analytics powerhouse with the capacity to handle big data, providing users with the technological power to analyse that data in order to gain greater insight and drive better decision-making. The other is to simplify and standardize that data so that it’s accessible enough to understand and act upon. We want to create a simplified landscape, one that allows better decisions, in real time, where there is a common language and a great experience for users.

2) What keys to success have you uncovered in your efforts?

Read more

Why Maturity Models For Data Governance Are Irrelevant In The Data Economy

Henry Peyret

There are multiple maturity models and associated assessments for data governance on the market. Some come from software vendors or from consulting companies, which use them as the basis for selling services. Others come from professional groups, like the one from the Data Governance Council.

They are all good, but frankly they are not adequate for the data economy many companies are entering. I think it is useful to challenge some ideas that have become too well established...

Maturity models in general are attractive because:

  • Using a maturity model is nearly a “no-brainer” exercise. You run an assessment to determine your current maturity level, then make a list of the actions that will take you to the next level. You do not need to ask the business for advice or involve many people in interviews.
  • Most data governance maturity models are modeled on the well-known CMMI, which means they are similar at least in structure and levels, so the debate over the advantages of one versus another comes down to level of detail.
Read more

How Bad Are Firms In China At Data Management?

Charlie Dai

Data management is becoming critical as organizations seek to better understand and target their customers, drive out inefficiency, and satisfy government regulations. Despite this, the maturity of data management practices at companies in China is generally poor.

I had an enlightening conversation with my colleague, senior analyst Michele Goetz, who covers all aspects of data management. She told me that in North America and Europe, data management maturity varies widely from company to company; only about 5% have mature practices and a robust data management infrastructure. Most organizations are still struggling to be agile and lack measurement, even if they already have data management platforms in place. Very few of them align adequately with their specific business or information strategy and organizational structure.

If we look at data management maturity in China, I suspect the results are even worse: fewer than 1% of companies are mature in terms of integrated strategy, agile execution, and continuous performance measurement. Specifically:

  • The practice of data management is still in its early stages. Data management is not simply about deploying technology like data warehousing or related middleware; it also means putting in place the strategy and architectural practices, including contextual services and metadata pattern modeling, that align with the business focus. Chinese enterprises’ current data management efforts focus mostly on data warehousing, master data management, and basic support for end-to-end business processes and for composite applications that serve top management decision-making. They are still far from leveraging the valuable data in their business processes and business analytics.
Read more

How To Partner With Data Quality Pros To Deliver Better Customer Service Experiences

Kate Leggett

Customer service leaders know that a good customer experience has a quantifiable impact on revenue, as measured by increased rates of repurchase, increased recommendations, and decreased willingness to defect from a brand. They also understand, at least conceptually, that clean data is important, but many can’t connect master data management and data quality investments to direct improvements in customer service metrics. As a result, IT initiates data projects more than two-thirds of the time, while data projects that directly affect customer service processes rarely get funded.

Customer service leaders need to partner with data management pros, who often work within IT, to reframe the conversation. Historically, IT organizations have attempted to drive technology investments with the ambiguous goal of “cleaning dirty customer data” within CRM, customer service, and other applications. Instead, this team must articulate the impact that poor-quality data has on critical business and customer-facing processes.

To do this, start by taking an inventory of the quality of the data that is currently available (a minimal profiling sketch follows the list below):

  • Chart the customer service processes that are followed by customer service agents. 80% of customer calls can be attributed to 20% of the issues handled.
  • Understand what customer, product, order, and past customer interaction data are needed to support these processes.
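
To make that inventory concrete, here is a minimal Python sketch of a profiling pass. It assumes a hypothetical list of customer records with invented field names, and simply measures per-field completeness and duplicate values, two common quality dimensions; it is an illustration, not a prescribed tool or method from the research.

    from collections import Counter

    # Hypothetical customer records; all field names are invented.
    records = [
        {"id": 1, "email": "ann@example.com", "phone": "555-0101", "last_order": "A1"},
        {"id": 2, "email": None,              "phone": "555-0102", "last_order": "A2"},
        {"id": 3, "email": "ann@example.com", "phone": None,       "last_order": None},
    ]

    def profile_quality(records, fields=("email", "phone", "last_order")):
        """Inventory data quality: per-field completeness and duplicate values."""
        total = len(records)
        report = {}
        for field in fields:
            values = [r[field] for r in records if r[field] is not None]
            dupes = [v for v, n in Counter(values).items() if n > 1]
            report[field] = {
                "completeness": round(len(values) / total, 2),  # non-null share
                "duplicates": dupes,  # repeated values worth investigating
            }
        return report

    for field, stats in profile_quality(records).items():
        print(field, stats)
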
Read more

Pros And Cons Of Using A Vendor-Provided Analytical Data Model In Your BI Implementation

Boris Evelson

The following question comes from many of our clients: what are some of the advantages and risks of implementing a vendor-provided analytical logical data model at the start of a Business Intelligence, Data Warehousing, or other Information Management initiative? Some quick thoughts on pros and cons:

Pros:

  • Leverages vendor knowledge from prior experience and other customers
  • May fill gaps in enterprise domain knowledge
  • Best if your IT department does not have experienced data modelers
  • May sometimes serve as a project, initiative, or solution accelerator
  • May sometimes break through a stalemate between stakeholders who fail to agree on metrics and definitions

Cons:

  • May sometimes require more customization effort than building a model from scratch
  • May create differences of opinion and potential roadblocks with your own experienced data modelers
  • May reduce the competitive advantage of business intelligence and analytics (since competitors may be using the same model)
  • Goes against “agile” BI principles that call for small, quick, tangible deliverables
  • Goes against top-down performance management design and modeling best practices, where one does not start with a logical data model but rather:
    • Defines departmental and line-of-business strategies
    • Links goals and objectives needed to fulfill these strategies
    • Defines metrics needed to measure progress against goals and objectives
    • Defines the strategic, tactical, and operational decisions that need to be made based on those metrics
Read more

Meet One-On-One With Forrester Analysts At Our Business & Technology Leadership Forum 2008

Sharyn Leaver

Consistently rated as one of the most popular features of Forrester Events, one-on-one meetings give you the opportunity to discuss the unique technology issues facing your organization with Forrester analysts. Business & Technology Leadership Forum attendees may schedule up to two 20-minute one-on-one meetings with the Forrester analysts of their choice, depending on availability. Registered attendees will be able to schedule one-on-one meetings starting on Monday September 15, 2008. Book early!

Read more