Kofax continues its acquisition rampage with a cash purchase of Kapow. I came across Kofax a few years ago while doing the research for "Take A Process View Of Content Integration." Apparently Kofax has taken the "process view." The idea behind that piece was that enterprises had so many diverse content stores that they needed to view conversion and migration of unstructured content as an internal competency.
But while content integration can reduce infrastructure costs and license fees, the real value is from improving business processes by linking content to business process management (BPM) and dynamic case management systems to reduce cycle time and improve compliance, customer support, and decision-making. These projects can be complex and challenging, but Kofax correctly sees this as a large opportunity. I do as well.
Another Kapow capability is to scrape websites and create consolidated views. For example, customer service reps often switch between apps in a clumsy and inefficient manner while the customer is on hold. In some cases, ECI software should grab the needed content behind the scenes and present it in a unified way. Kapow Technologies' content integration solution works like a robot to extract, transform, and load content from Web-based apps to consolidated views. I interviewed one large telecommunications company that used Kapow's robot for customer service business processes to eliminate task switching and repetitive tasks. According to the company:
Sometimes getting the data quality right is just hard, if not impossible. Even after implementing data quality tools, acquiring third-party data feeds, and establishing data steward remediation processes, the business is often still not satisfied with the quality of the data. Data is still missing, stale, or irrelevant. For example: insurance companies want access to construction data to improve catastrophe modeling; food chains need to incorporate drop-off bays and instructions for outlets in shopping malls and plazas to get food supplies to the prep tables; and global companies need to validate address information in developing countries whose postal directories are incomplete or fast-changing. What it takes to complete and improve the data has now entered the realm of hands-on processes.
Crowdflower says it has the answer to the data challenges listed above. Its model combines crowdsourcing with a data stewardship platform to manage the last mile in data quality. The crowd is a vast network of people around the globe who are notified of data quality tasks through the data stewardship platform. If a contributor can help with the data quality need within the requester's time frame, he or she accepts the task and gets to work. The crowd can use all resources and channels available to complete tasks, such as web searches, site visits, and phone inquiries. Quality control is performed to validate crowdsourced data and improvements. As an organization submits more data quality tasks, machine learning is applied to analyze contributors' scores and results and optimize how work is crowdsourced.
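To make the mechanics concrete, here is a toy sketch of that routing-validation-scoring loop. Everything in it is hypothetical: the function names, the majority-vote validation, and the running-accuracy score are illustrative assumptions, not a description of Crowdflower's actual platform.

```python
from collections import Counter

def validate(answers):
    """Quality control: accept a crowdsourced value only when a strict
    majority of contributors independently agree on it."""
    value, votes = Counter(answers).most_common(1)[0]
    return value if votes > len(answers) / 2 else None

def update_score(score, completed, correct):
    """Running accuracy for one contributor -- the kind of signal a
    platform could feed back into smarter task routing."""
    return (score * completed + (1 if correct else 0)) / (completed + 1)

def route(task, contributors):
    """Offer the task to willing contributors; pick the highest-scoring
    one. Contributors are dicts with 'score' and an 'accepts' callable."""
    accepted = [c for c in contributors if c["accepts"](task)]
    return max(accepted, key=lambda c: c["score"]) if accepted else None
```

For example, `validate(["A", "A", "B"])` returns `"A"`, while a 1-1 split returns `None` and the task would go back to the crowd.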
With the employer mandate delays being the latest setback to U.S. President Obama's push for national healthcare, it's worth looking at how other countries are successfully tackling the same problem. The United Kingdom has had nationalized healthcare for years, and one of the things that makes this effort so successful is its approach to data collaboration — something Forrester calls Adaptive Intelligence.
While the UK hasn't successfully moved into fully electronic health records, it has in place today a health records sharing system that lets its more than 27,000 member organizations string together patient care information across providers, hospitals, and ministries, creating a fuller, more accurate picture of each patient, which results in better care. At the heart of this exchange is a central data sharing system called Spine. It's through Spine that all the National Health Service (NHS) member organizations connect their data sets for integration and analysis. The data-sharing model Spine creates has been integral in the creation of summary care records across providers, an electronic prescription service, and highly detailed patient care quality analysis. As we discussed in the Forrester report "Introducing Adaptive Intelligence," no single company can, on its own, create an accurate picture of its customers or its business without collaborating on the data and analysis with other organizations that have complementary views that flesh out the picture.
The other day, I had one of those eureka-like moments. As I lay in the bath, my thoughts shifted back and forth between the past and the present, recognizing how advances (or the lack of advances) in technology have affected our lives. When thinking about the past, I remember the days of my communication engineering apprenticeship; this was in the days of electro-mechanical exchanges. Some of you may remember, or may have seen in an old film, a telephone operator connecting two phone lines by placing a connecting cord between two phone line jacks. This was the world of telecommunication exchanges in the 1970s — no fancy computing technology existed in telecommunications at the time. It was certainly not a trivial exercise to upgrade capacity, maintain the exchange, or connect to another exchange. When thinking about the present, I marvel at the continuing improvements in plug-and-play hardware and software technology. As an example, I buy a new camera and, hey presto! I now have the ability to edit and post pictures on forums or cloud applications, to send them by email, or to store them on third-party storage from my camera.
So back to my eureka-like moment. I’m thinking that, surely, all these present-day technology advances have been enabled by standards, design patterns, and common interfaces. My mind keeps focusing on design patterns, and the question arises: "Is there such a thing as business design patterns?" I have done some initial research, and I have yet to find evidence of the term or concept of business design patterns. However, I suspect they exist because:
I had a conversation recently with Brian Lent, founder, chairman, and CTO of Medio. If you don’t know Brian, he has worked with companies such as Google and Amazon to build and hone their algorithms and is currently taking predictive analytics to mobile engagement. The perspective he brings as a data scientist not only has ramifications for big data analytics but also drastically shifts the paradigm for how we architect our master data and ensure quality.
We discussed big data analytics in the context of behavior and engagement. Think shopping carts and search. At the core, analytics is about the “closed loop.” It is, as Brian says, a rinse and repeat cycle. You gain insight for relevant engagement with a customer, you engage, then you take the results of that engagement and put them back into the analysis.
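That rinse-and-repeat cycle can be sketched in a few lines. This is a deliberately abstract illustration of the closed loop as described above, not Medio's implementation; the function names and the round count are my own assumptions.

```python
def closed_loop(insight_model, engage, rounds=3):
    """Rinse and repeat: derive insight from everything observed so far,
    engage on that insight, then feed the result back into the analysis."""
    history = []                          # accumulated engagement results
    for _ in range(rounds):
        insight = insight_model(history)  # analyze: gain insight from history
        result = engage(insight)          # act: engage the customer
        history.append(result)            # close the loop: result feeds back in
    return history
```

The essential point for data management is that `history` — the output of engagement — is also an input to the next round of analysis, so the pipeline has no final destination.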
Sounds simple, but think about what that means for data management. Brian provided two principles:
For the past ten years, the major IT initiative within Chinese organizations has been service-oriented and/or process-driven architecture. The pace of change has been slow for two reasons: 1) From an end user perspective, related business requirements are not clear or of high priority; 2) more importantly, solution providers have not been ready to embrace technology innovation and meet emerging technology requirements through new business models.
Times are changing. IBM and other major ISVs/SIs in China (as well as end users) are driving momentum around emerging technology, such as cloud and enterprise mobility. I recently attended the IBM Technical Summit 2013 in Beijing from July 11 to 12. Here’s what I learned:
Telecom carriers, supported by technology vendors, will accelerate cloud adoption by SMEs. Contributing more than 60% of total GDP in China, small and medium enterprises (SMEs) have always sought to simplify their IT operations as much as possible and, at the same time, to scale up as quickly as possible when business expands. IaaS solutions appear to be a perfect match for SMEs; however, IT professionals have concerns about security and data privacy when operations are handled by other companies.
How is it possible for a local company to defeat global giants like Pepsi, Coca-Cola, and Watsons in its market segment and establish market leadership for more than a decade? The answer is given by Nongfu Spring, a Chinese company in the manufacturing and retail industries. In my recent report “Case Study: Technology Innovation Enables Nongfu Spring To Strengthen Market Leadership”, I analyzed the key factors behind its success and provided related best practices from an enterprise architecture perspective. These factors include:
Business strategy is enterprise architecture's top priority. EA pros often need to be involved in project-level IT activities to resolve issues and help IT teams put out fires. But it's much more important that architects have a vision, clearly understand the business strategy, and thoroughly consider the appropriate road map that will support it in order to be able to address the root causes of challenges.
Agile infrastructure sets up the foundation for scalable business growth. Infrastructure scalability is the basis of business scalability. Infrastructure experts should consider not only the agility that virtualization and IaaS solutions will provide to next-generation infrastructure, but also network-level load balancing among multiple telecom carriers. They should also refine the network topology for enterprise security.
It is easy to get caught up in the source and target paradigm when implementing master data management. The logical model looms large to identify where master data resides for linkage and makes the project -- well -- logical.
If the first step in your customer MDM endeavor is creating a master data definition by identifying relevant data elements, STOP!
The first step is to articulate the story that customer MDM will support. This is the customer MDM blueprint.
For example, if the driving business strategy is to create a winning customer experience, customer MDM puts the customer definition at the center of what the customer experience looks like. The customer experience is the story. You need to understand and have data points for elements such as preferences, sentiment, lifestyle, and friends/relationships. These elements may be available within your CRM system, in social networks, with partners, and from third-party data providers. The elements may be discrete or derived from analytics. If you only look for name, address, phone, and email, there is nothing about this definition that helps determine how to place that contact into the context of engagement.
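One way to see the difference is to write the two definitions down as data structures. The record below is a hypothetical illustration of a story-driven customer definition — the field names and sources are my assumptions, not a prescribed schema — but it makes plain how much an engagement-centered master record carries beyond name, address, phone, and email.

```python
from dataclasses import dataclass, field

@dataclass
class CustomerMaster:
    # The classic name-and-address core:
    customer_id: str
    name: str
    email: str
    # Story-driven elements, sourced from CRM, social networks, partners,
    # or third-party providers; some discrete, some derived from analytics:
    preferences: dict = field(default_factory=dict)    # e.g. {"channel": "sms"}
    sentiment: float = 0.0                             # derived score, -1.0 to 1.0
    lifestyle: list = field(default_factory=list)      # e.g. ["frequent traveler"]
    relationships: list = field(default_factory=list)  # linked customer_ids
```

The first three fields tell you who the customer is; the last four are what let you place the contact into a context of engagement.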
Ultimately, isn’t that what the business is asking for when it wants the promised 360-degree view of the customer? Demands for complete, relevant, and timely data are not grounded in the databases, data dictionaries, and integration/transformation processes of your warehouses and applications; they are grounded in the story.
So, don’t start with the data. Start with the story you want to tell.
I'm really pleased to be working with ARMA International -- the not-for-profit industry association representing professionals in the fields of information management, records management, compliance, and library/archives. This is the fifth year that Forrester and ARMA have jointly developed a survey to take the pulse of the profession. We're interested in understanding the top challenges, trends, buying patterns, and professional development issues in this space. As the practice of records management evolves into "information governance" and a digital-first perspective, data and insights are needed to help individuals and solution providers to make this transition.
We're interested in tracking how this market has evolved over the past five years. How are records and information managers coping with rising cloud adoption? How close is the alignment with IT decision-makers when budgets are planned and software is acquired? How are organizations keeping pace with social media records? Can't wait to see the year-over-year trends!
If you are the decision-maker for records management initiatives in your organization, please help us by taking this survey before end of day Friday, July 12. If you're not the right person -- please pass along the link to your colleagues who are!
I’ve been presenting research on big data and data governance for the past several months where I show a slide of a businesswoman doing a backbend to access data in her laptop. The point I make is that data management has to be hyper-flexible to meet a wider range of analytic and consumption demands than ever before. Translated, you need to cross-train for data management to have cross-fit data.
The challenge is that traditional data management takes a one-size-fits-all approach. Data systems are purpose-built. If organizations want to reuse a finance warehouse for marketing and sales purposes, it often isn’t a match, and a new warehouse is built. If you want to get out of this cycle and go from data couch potato to data athlete, a cross-fit data training program should focus on:
Context first. Understanding how data is used and will provide value drives platform design. Context indicates more than where data is sourced from and where it will be delivered. Context answers: operations or analytics, structured or unstructured, persistent or disposable? These guide decisions around performance, scale, sourcing, cost, and governance.
Data governance zones. Command and control data governance creates a culture of “no” that stifles innovation and can cause the business to go around IT for data needs. The solution is to create policies and processes that give permission as well as mitigate risk. Loosen quality and security standards in projects and scenarios that are in contained environments. Tighten rules and create gates when called for by regulation, where there are ethical conflicts, or when data quality or access exposes the business to significant financial risk.
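The two points above can be sketched as decision logic. This is a minimal, assumption-laden illustration — the function names, the three context questions, and the zone labels ("tight"/"loose"/"standard") are mine, not a Forrester framework — but it shows how context answers could drive platform choices and how zone rules replace a blanket "no."

```python
def platform_profile(workload, structure, lifespan):
    """Context first: map the three context answers (operations or
    analytics, structured or unstructured, persistent or disposable)
    to rough platform decisions. The mapping is illustrative only."""
    return {
        "store": "warehouse" if structure == "structured" else "data lake",
        "latency": "low" if workload == "operations" else "batch",
        "retention": "archive" if lifespan == "persistent" else "ttl-expire",
    }

def governance_zone(contained, regulated, ethical_conflict, financial_risk):
    """Governance zones: tighten rules and create gates when regulation,
    ethics, or financial risk demands it; loosen standards inside
    contained environments to give the business permission to innovate."""
    if regulated or ethical_conflict or financial_risk:
        return "tight"     # formal gates, strict quality and security standards
    if contained:
        return "loose"     # sandbox: relaxed standards, risk is contained
    return "standard"
```

A regulated workload lands in the "tight" zone even when it runs in a contained sandbox — the risk conditions take precedence over the permission conditions.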