Retaining and delighting empowered customers requires continuous, technology-enabled innovation and improved customer insight (CI). The logic is simple in theory, but that doesn’t make it any easier to implement in practice.
In my recent report, entitled “Applying Customer Insight To Your Digital Strategy”, I highlight the top lessons learned from organizations in Asia Pacific (AP) that are successfully leveraging CI to fuel digital initiatives. It all starts by ensuring that data-driven decision-making is central to the digital strategy. With that in mind, I want to use this blog post to focus on two key lessons from the report:
Lesson One: Establish A Clear Mandate To Invest In Customer Analytics
Successful companies serve empowered customers in the way they want to be served, not the way the company wants to serve them. When building a mandate, you should:
■ Expect natural tensions between various business stakeholders to arise. To secure buy-in from senior business decision-makers, start by illustrating the clear link between digital capabilities and data as a source of improved customer understanding. Identify measurable objectives and then link them to three to four scenarios that highlight where the biggest opportunities and risks exist. Continue to justify data-related investments by restating these scenarios at regular intervals.
Intel has made no secret of its development of the Xeon D, an SOC product designed to take Xeon processing close to the power levels and product niches currently occupied by its lower-power, lower-performance Atom line, and where emerging competition from ARM is more viable.
The new Xeon D-1500 is clear evidence that Intel “gets it” as far as platforms for hyperscale computing and other throughput-per-watt- and density-sensitive workloads are concerned, both in the enterprise and in the cloud. The D-1500 breaks new ground in several areas:
It is the first Xeon SOC, combining 4 or 8 Xeon cores with embedded I/O including SATA, PCIe, and multiple 10 Gb and 1 Gb Ethernet ports.
It is the first of Intel’s 14 nm server chips expected to be introduced this year. The expected process shrink should also deliver further gains in performance and performance per watt across the entire line of entry-level through midrange server parts this year.
Why is this significant?
With the D-1500, Intel effectively draws a very deep line in the sand for emerging ARM technology as well as for AMD. The D-1500, with a 20 W to 45 W power envelope, delivers the lower end of Xeon performance at power and density levels previously associated with Atom, and close enough to what is expected from the newer generation of higher-performance ARM chips to once again call into question the viability of ARM on a pure performance and efficiency basis. While ARM implementations with embedded accelerators such as DSPs may still be attractive for selected workloads, the availability of a mainstream x86 option at these power levels may blunt the pace of ARM design wins for general-purpose servers as well as embedded designs, notably for storage systems.
The business has an insatiable appetite for data and insights. Even in the age of big data, the number one issue for business stakeholders and analysts is getting access to the data. Once access is achieved, the next step is "wrangling" the data into a usable data set for analysis. The term "wrangling" itself creates a nervous twitch, unless you enjoy the rodeo. But the goal of the business isn't to be an adrenaline junkie. The goal is to get insight that helps it navigate increasingly complex business landscapes and customer interactions. Those who get this have introduced a softer term, "blending," which is just another term dreamed up by data vendor marketers to avoid the dreaded conversation about data integration and data governance.
The reality is that you can't market-message your way out of the fundamental problem: big data is creating data swamps, even in the best-intentioned efforts. (This is the reality of big data's first principle of schema-less data.) Data governance for big data is largely relegated to cataloging data and its lineage, which serves the data management team but creates a new kind of nightmare for analysts and data scientists: working with a card catalog that rivals the Library of Congress. Dropping in a self-service business intelligence tool or advanced analytics solution doesn't solve the problem of familiarizing the analyst with the data. Analysts will still spend up to 80% of their time just trying to create the data set they need to draw insights.
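To make the "wrangling" tax concrete, here is a minimal, hypothetical sketch in Python with pandas. The data sources, column names, and values are invented for illustration and aren't drawn from any particular vendor's blending tool:

```python
# Illustrative only: a toy version of the "wrangling"/"blending" step
# that consumes so much analyst time. All sources, columns, and values
# here are hypothetical.
import pandas as pd

# Two raw extracts with inconsistent conventions, a common reality.
crm = pd.DataFrame({
    "cust_id": [101, 102, 103],
    "Region": ["APAC", "emea", "APAC"],
    "revenue": ["1,200", "950", "3,400"],  # strings with thousands separators
})
web = pd.DataFrame({
    "customer_id": [101, 103, 104],
    "visits": [14, 3, 8],
})

# "Wrangling": normalize categorical values, types, and join keys.
crm["Region"] = crm["Region"].str.upper()
crm["revenue"] = crm["revenue"].str.replace(",", "").astype(float)
web = web.rename(columns={"customer_id": "cust_id"})

# "Blending": join the cleaned sets into one analysis-ready frame.
blended = crm.merge(web, on="cust_id", how="left")
print(blended)
```

Even this toy example needs key renaming, type coercion, and value normalization before a simple join works; multiply that across dozens of sources and the 80% figure becomes easy to believe.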
There's never been a question about the advantages of open source software. Crowdsourcing, vendor independence, the ability to see and in some cases control the source code, and lower costs are just a few benefits of the open source software (OSS) model. Linux and Apache Hadoop are prime examples of successful OSS projects. It's a different story, however, when it comes to OSS BI. For years, OSS BI vendors have struggled with growth because of:
The developer-centric nature of open source projects. The target audience for open source projects is developers, which means deals are mostly sealed by technology management. BI purchasing, on the other hand, has gravitated toward business decision-makers over the last several years, and business users are less interested in the opportunities a collaborative open source community offers and more concerned with ease of use and quick setup. Indeed, Forrester's research consistently finds that business ownership is one of the key success factors for effective BI initiatives.
The battle to apply traditional waterfall software development life-cycle (SDLC) methodology and project management to BI has already been fought, and largely lost. These approaches and best practices, which apply to most other enterprise applications, work well in some cases, such as very well-defined and stable BI capabilities like tax or regulatory reporting. Mission-critical, enterprise-grade BI apps can also have a reasonably long shelf life of a year or more. But these best practices do not work for the majority (anecdotally, about three-quarters) of BI initiatives, where requirements change much faster than traditional approaches can support; by the time a traditional BI application development team rolls out what it thought was a well-designed BI application, it's too late. As a result, BI pros need to move beyond earlier-generation BI support organizations to:
Focus on business outcomes, not just technologies. Earlier-generation BI programs lacked an "outcomes first" mentality. Those programs employed bottom-up approaches that focused on project management and technology first, leaving clients without the outcomes they needed to manage the business; in other words, they created an insights-to-action gap. BI pros should use a top-down approach that defines the key performance indicators, metrics, and measures that support the business's goals and objectives. They must resist the temptation to address technology and data needs before the business requirements.
When you hear the term fast data, your first thought is probably the velocity of the data. That's not unusual in the realm of big data, where velocity is one of the V's everyone talks about. However, fast data encompasses more than a data characteristic; it is about how quickly you can get and use insight.
Working with Noel Yuhanna on an upcoming report on how to develop your data management roadmap, we found that speed was a recurring theme. Clients consistently call out speed as what holds them back; how they interpret what speed means is the crux of the issue.
Technology management thinks about how quickly data is provisioned. The solution is a faster engine: in-memory grids like SAP HANA become the tool of choice. This is the wrong way to think about it. Simply serving up data with faster integration and a higher-performance platform is what we have always done: a better box, better integration software, a better data warehouse. Why use the same solution that, in a year or two, runs up against the same wall?
The other side of the equation is that simply sending data out faster ignores what business stakeholders and analytics teams want. Speed to the business encompasses self-service data acquisition, faster deployment of data services, and faster changes. The reason: they need to act on the data and insights.
The right strategy is to create a vision oriented toward business outcomes. Today's reality is that it is no longer about being first to market; it is about being first to value: first to value with our customers, and first to value with our business capabilities. The speed at which insights are gained, and ultimately how they are put to use, is your data management strategy.
Last year I published a reasonably well-received research document on Hadoop infrastructure, “Building the Foundations for Customer Insight: Hadoop Infrastructure Architecture”. Now, less than a year later, it’s looking obsolete, not so much because it was wrong for traditional Hadoop (and yes, it does seem funny to use a word like “traditional” to describe a technology that is itself still rapidly evolving and has been in mainstream use for only a handful of years), but because the universe of analytics technology and tools has been evolving at light speed.
If your analytics are anchored by Hadoop and its underlying MapReduce processing, then the mainstream architecture described in the document, clusters of servers each with their own compute and storage, may still be appropriate. On the other hand, if, like many enterprises, you are adding analysis tools such as NoSQL databases, SQL on Hadoop (Impala, Stinger, Vertica), and particularly Spark, an in-memory analytics technology well suited to real-time and streaming data, it may be necessary to reassess the supporting infrastructure in order to build something that can continue to support Hadoop while also catering to the differing access patterns of these other tool sets. This need to rethink the underlying analytics plumbing was brought home by HP’s recent demonstration of a reference architecture for analytics, publicly referred to as the HP Big Data Reference Architecture.
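For readers less familiar with the processing model these architectures assume, here is a toy, single-process Python sketch of the MapReduce pattern that anchors classic Hadoop. Real Hadoop distributes these phases across a cluster and persists intermediate results to disk between phases, which is precisely the overhead that in-memory engines such as Spark avoid for iterative and streaming workloads:

```python
# Toy, single-process illustration of the MapReduce pattern.
# In classic Hadoop, the mapped (key, value) pairs are shuffled
# across the cluster and written to disk between phases; Spark's
# advantage is keeping such intermediates in memory.
from collections import defaultdict

docs = ["big data big clusters", "spark keeps data in memory"]

# Map phase: emit (word, 1) for every word in every document.
mapped = [(word, 1) for doc in docs for word in doc.split()]

# Shuffle phase: group the emitted values by key.
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce phase: aggregate each key's values.
counts = {word: sum(vals) for word, vals in groups.items()}
print(counts)  # e.g. {'big': 2, 'data': 2, ...}
```

The point of the sketch is the data flow, not the word count itself: each phase produces a full intermediate data set, and where that intermediate lives (disk versus memory) is what separates the access patterns of batch MapReduce from tools like Spark.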
Between 2012 and 2014, mobile BI adoption shot up: Forrester survey data shows that the percentage of technology decision-makers who make some BI applications available on mobile devices has nearly quadrupled, and the percentage who state that BI is delivered exclusively via mobile devices has risen from 1% in 2012 to 7% in 2014. While this clearly demonstrates that mobile BI is gaining traction, the actual adoption picture is rather more nuanced. Our ongoing research and client interactions show that mobile BI adopters fall into three overall groups. Some organizations:
Really ‘get’ the transformational potential of mobile BI. They are the ones who understand that mobile BI is about much more than liberating reports and dashboards from the desktop. They focus on how data can be leveraged to best effect when in the hands of the right person at the right time. If necessary, they’re prepared to change their business processes accordingly. For those companies, mobile BI is an enabler of strategic goals, and deployment is a journey, not an end in itself.
Make mobile BI available because it’s the right thing to do, or they’ve been asked to. Many of these organizations are reaping considerable benefits from their mobile BI implementations, and the more far-sighted of them are working on how to move from the tactical to the strategic. Equally, many are trying to figure out where to go from here, in particular if the initial deployment doesn't show a clear benefit, let alone return on investment.
On one level, IBM’s new z13, announced last Wednesday in New York, is exactly what the mainframe world has been expecting for the last two and a half years: more capacity (a big boost this time around, with triple the main memory, more and faster cores, more I/O ports, etc.), a modest boost in price-performance, and a very sexy cabinet design. (I know it’s not really a major evaluation factor, but I think IBM’s industrial design for its system enclosures for Flex System, Power, and the z System is absolutely gorgeous and should be in MoMA.) IBM indeed delivered against these expectations, plus more. In this case, a lot more.
In addition to the required upgrades to fuel the normal mainframe upgrade cycle and its reasonably predictable revenue, IBM has made a bold but rational repositioning of the mainframe as a core platform for the workloads generated by mobile transactions, the most rapidly growing workload across all sectors of the global economy. What makes this positioning rational, as opposed to a pipe dream, is an underlying pattern common to many of these transactions: at some point they access data generated by and stored on a mainframe. By enhancing the economics of the increasingly Linux-centric processing chain that occurs before the call for the mainframe data, IBM hopes to foster the migration of these workloads to the mainframe, where their access to the resident data will be more efficient, benefiting from inherently lower latency for data access as well as from embedded high-value functions such as accelerators for inline analytics. In essence, IBM hopes to shift the center of gravity for mobile processing toward the mainframe and away from the distributed x86 Linux systems that IBM no longer manufactures.
To compete in today's global economy, businesses and governments need agility and the ability to adapt quickly to change. The same is true when rolling out enterprise-grade business intelligence (BI) applications internally. BI change is ongoing; often, many things change concurrently. One element that too often takes a back seat is the impact of changes on the organization's people. Prosci, an independent research company focused on organizational change management (OCM), has developed benchmarks that propose five areas in which change management needs to do better, all involving the people side of change: better engage the sponsor; begin organizational change management early in the change process; get employees engaged in change activities; secure sufficient personnel resources; and better communicate with employees. Because BI is not a single application, and often not even a single platform, we recommend adding a sixth area: visibility into BI usage and performance management of BI itself, aka BI on BI. Forrester recommends keeping these six areas top of mind as your organization prepares for any kind of change.
Some strategic business events, like mergers, are high-risk initiatives involving major changes over two or more years; others, such as restructurings, must be implemented in six months. In the case of BI, some changes might need to happen within a few weeks or even days. All changes will lead to either achieving or failing to achieve a business objective. There are seven major categories of business and organizational change: