Joining in on the spirit of all the 2013 predictions, it seems that we shouldn't leave data quality out of the mix. Data quality may not be as sexy as big data has been this past year. The technology is mature and reliable, and the concept is easy to understand. It is also one of the few areas in data management with a recognized, widely adopted framework for measuring success. (Read Malcolm Chisholm's blog on data quality dimensions.) However, maturity shouldn't breed complacency. Data quality still matters, a lot.
Yet judgment day is here, and data quality is at a crossroads. Its maturity in both technology and practice is steeped in an old way of thinking about and managing data. Data quality technology is firmly seated in the world of data warehousing and ETL. While that world remains a significant portion of the enterprise data management landscape, the adoption of in-memory platforms, Hadoop, data virtualization, streams, and the like in business-critical applications and processes means that more and more data is bypassing the traditional platform.
The options for managing data quality are expanding, but not necessarily in a way that ensures data can be trusted or complies with data policies. Where data quality tools have provided value is in offering a workbench to centrally monitor, create, and manage data quality processes and rules. They created sanity where ETL spaghetti created chaos and uncertainty. Today, this value proposition is diminishing as data virtualization, Hadoop processes, and data appliances create and persist new data quality silos. Worse, these silos often lack the monitoring and measurement needed to govern data. In the end, do we have data quality? Or are we back where we started?
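To make the "central workbench" value proposition concrete, here is a minimal sketch of the idea: a single shared registry of quality rules that can score records from any source, so measurement stays comparable instead of fragmenting into silos. All rule names, fields, and sample rows below are hypothetical illustrations, not taken from any particular tool.

```python
# Illustrative sketch: one central set of data quality rules applied
# uniformly to records from any platform (warehouse, Hadoop extract,
# virtualized view). Field names and rules are invented examples.

def not_null(field):
    return lambda rec: rec.get(field) not in (None, "")

def has_length(field, length):
    return lambda rec: len(str(rec.get(field, ""))) == length

# The shared rule registry, reused wherever data lands.
RULES = {
    "customer_id_present": not_null("customer_id"),
    "postal_code_5_digits": has_length("postal_code", 5),
}

def score(records):
    """Return the pass rate per rule -- a simple, comparable quality metric."""
    return {
        name: sum(rule(r) for r in records) / len(records)
        for name, rule in RULES.items()
    }

warehouse_rows = [
    {"customer_id": "C1", "postal_code": "02139"},
    {"customer_id": "",   "postal_code": "0213"},
]
print(score(warehouse_rows))
# → {'customer_id_present': 0.5, 'postal_code_5_digits': 0.5}
```

Because every source is scored against the same registry, the pass rates are directly comparable across platforms, which is precisely the monitoring the new silos tend to lack.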
Data management is becoming critical as organizations seek to better understand and target their customers, drive out inefficiency, and satisfy government regulations. Despite this, the maturity of data management practices at companies in China is generally poor.
I had an enlightening conversation with my colleague, senior analyst Michele Goetz, who covers all aspects of data management. She told me that in North America and Europe, data management maturity varies widely from company to company; only about 5% have mature practices and a robust data management infrastructure. Most organizations are still struggling to be agile and lack measurement, even if they already have data management platforms in place. Very few of them align adequately with their specific business or information strategy and organizational structure.
If we look at data management maturity in China, I suspect the results are even worse: that fewer than 1% of the companies are mature in terms of integrated strategy, agile execution and continuous performance measurement. Specifically:
The practice of data management is still in the early stages. Data management is not simply about deploying technology like data warehousing or related middleware; it also means putting in place the strategy and architectural practice, including contextual services and metadata pattern modeling, to align with business focus. The current focus of Chinese enterprises for data management is mostly around data warehousing, master data management, and basic support for both end-to-end business processes and composite applications for top management decision-making. It’s still far from leveraging the valuable data in business processes and business analytics.
The number one question I get from clients regarding their data strategy and data governance is, “How do I create a business case?”
This question is the kiss of death and here is why.
You created an IT strategy that emphasized optimizing IT data management efforts, lowering total cost of ownership, and reducing cost, and that focused on the technical requirements to develop the platform. There may have been a nod toward helping the business by highlighting improvements in data quality, consistency, and the management of access and security in broad, vague terms. The data strategy ended up looking more like an IT plan to execute data management.
This leaves the business asking, “So what? What is in it for me?”
Rethink your approach and think like the business:
· Change your data strategy to a business strategy. Recognize the strategy, objectives, and capabilities the business is looking for related to key initiatives. Your strategy should create a vision for how data will make these business needs a reality.
· Stop searching for the business case. The business case should already exist based on project requests at a line of business and executive level. Use the input to identify a strategy and solution that supports these requests.
· Avoid “shiny object syndrome”. As you keep up with emerging technology and trends, keep these new solutions and tools in context. There are more data integration, database, data governance, and storage options than ever before and one size does not fit all. Leverage your research to identify the right technology for business capabilities.
There was lots of feedback on the last blog (“Risk Data, Risky Business?”) that clearly indicates the divide between definitions of trust and quality. It is a great jumping-off point for the next hot topic: data governance for big data.
The comment I hear most from clients, particularly when discussing big data, is, “Data governance inhibits agility.” Why be hindered by committees and bureaucracy when you want freedom to experiment and discover?
Current thinking: Data governance is freedom from risk. The stakes are high when it comes to data-intensive projects, and having the right alignment between IT and the business is crucial. Data governance has been the gold standard for establishing the right roles, responsibilities, processes, and procedures to deliver trusted, secure data. Success has been achieved through legislative means: enacting policies and procedures that reduce the risk to the business from bad data and poorly implemented data management projects. Data governance was meant to keep bad things from happening.
Today’s data governance approach is important and certainly has a place in the new world of big data. When data enters the inner sanctum of an organization, management needs to be rigorous.
Yet, the challenge is that legislative data governance by nature is focused on risk avoidance. Often this model is still IT led. This holds progress back as the business may be at the table, but it isn’t bought in. This is evidenced by committee and project management style data governance programs focused on ownership, scope, and timelines. All this management and process takes time and stifles experimentation and growth.
Customer service leaders know that a good customer experience has a quantifiable impact on revenue, as measured by increased rates of repurchase, increased recommendations, and decreased willingness to defect from a brand. They also conceptually understand that clean data is important, but many can’t make the connection between how master data management and data quality investments directly improve customer service metrics. This means that IT initiates data projects more than two-thirds of the time, while data projects that directly affect customer service processes rarely get funded.
What needs to happen is that customer service leaders have to partner with data management pros — often working within IT — to reframe the conversation. Historically, IT organizations would attempt to drive technology investments with the ambiguous goal of “cleaning dirty customer data” within CRM, customer service, and other applications. Instead of this approach, this team must articulate the impact that poor-quality data has on critical business and customer-facing processes.
To do this, start by taking an inventory of the quality of data that is currently available:
· Chart the customer service processes that are followed by customer service agents. 80% of customer calls can be attributed to 20% of the issues handled.
· Understand what customer, product, order, and past customer interaction data are needed to support these processes.
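One simple way to start that inventory is a completeness profile: for each field an agent-facing process depends on, measure the share of records where it is actually populated. The sketch below is purely illustrative; the field names and sample CRM records are invented for the example.

```python
# Hypothetical completeness profile for a data quality inventory:
# for each field a customer service process needs, report the share
# of records where that field is populated. Data is illustrative only.

def completeness(records, fields):
    total = len(records)
    return {
        f: sum(1 for r in records if r.get(f) not in (None, "")) / total
        for f in fields
    }

crm_records = [
    {"name": "Ada",   "email": "ada@example.com", "last_order_id": "O-17"},
    {"name": "Grace", "email": "",                "last_order_id": None},
]

# Fields the charted service processes were found to depend on.
needed = ["name", "email", "last_order_id"]
print(completeness(crm_records, needed))
# → {'name': 1.0, 'email': 0.5, 'last_order_id': 0.5}
```

A profile like this turns "dirty customer data" from an ambiguous complaint into a number that can be tied to a specific customer-facing process, which is exactly the reframing the post argues for.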
Although millions of people remain out of work, the economy has clearly thawed, and organizations are returning to investing in customer-facing business processes with a vengeance. Client inquiries and advisory work on CRM topics are going through the roof here at Forrester. Our most recent forecast for global IT purchases of business software anticipates a healthy 9.7% increase in 2010, after a brutal decline of 8.0% in 2009. And Social CRM is all the rage in the blogosphere.
If you are watching the Olympics, you know that figure skaters spend years honing their fundamental skills before attempting advanced patterns. And they never stop practicing their elementary figures. My latest report on the key trends driving CRM technology adoption spotlights how flawless execution will continue to separate successful CRM initiatives from the losers.
We surveyed 58 business and IT professionals to identify the best practices for getting more value from CRM technology projects. These five fundamentals were the keys to success before the economic meltdown — and they remain so today:
The following question comes from many of our clients: what are some of the advantages and risks of implementing a vendor-provided analytical logical data model at the start of a Business Intelligence, Data Warehousing, or other Information Management initiative? Some quick thoughts on pros and cons:
Pros:

· Leverages vendor knowledge from prior experience and other customers
· May fill in gaps in enterprise domain knowledge
· Best if your IT department does not have experienced data modelers
· May sometimes serve as a project, initiative, or solution accelerator
· May sometimes break through a stalemate between stakeholders failing to agree on metrics and definitions

Cons:

· May sometimes require more customization effort than building a model from scratch
· May create differences of opinion and potential roadblocks from your own experienced data modelers
· May reduce the competitive advantage of business intelligence and analytics (since competitors may be using the same model)
· Goes against “agile” BI principles that call for small, quick, tangible deliverables
· Goes against top-down performance management design and modeling best practices, where one does not start with a logical data model but rather:
  · Defines departmental and line-of-business strategies
  · Links goals and objectives needed to fulfill these strategies
  · Defines metrics needed to measure progress against goals and objectives
  · Defines strategic, tactical, and operational decisions that need to be made based on those metrics
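The top-down chain described above can be captured as plain data before any logical model is drawn, so that every metric traces back to the objective and strategy it serves. The sketch below is purely illustrative; the strategy, goal, metrics, and decisions are invented examples.

```python
# Illustrative only: the top-down performance management chain
# (strategy -> objectives -> metrics -> decisions) as plain data.
# All names below are invented examples.

performance_model = {
    "strategy": "Grow repeat purchases in the retail line of business",
    "objectives": [
        {
            "goal": "Raise repurchase rate 5% year over year",
            "metrics": ["repurchase_rate", "avg_days_between_orders"],
            "decisions": [
                "Tactical: which lapsed segments get a win-back offer",
                "Operational: when an agent offers a loyalty incentive",
            ],
        }
    ],
}

# Only after this chain is defined does a logical data model follow:
# each metric implies the entities and attributes needed to compute it.
metrics = [m for o in performance_model["objectives"] for m in o["metrics"]]
print(metrics)  # → ['repurchase_rate', 'avg_days_between_orders']
```

Working in this direction, the logical data model is derived from the metrics the business committed to, rather than adopted wholesale from a vendor.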
Consistently rated as one of the most popular features of Forrester Events, one-on-one meetings give you the opportunity to discuss the unique technology issues facing your organization with Forrester analysts. Business & Technology Leadership Forum attendees may schedule up to two 20-minute one-on-one meetings with the Forrester analysts of their choice, depending on availability. Registered attendees will be able to schedule one-on-one meetings starting on Monday September 15, 2008. Book early!