Between 2012 and 2014, mobile BI adoption shot up: Forrester survey data shows that the percentage of technology decision-makers who make some BI applications available on mobile devices nearly quadrupled, and the percentage who say that BI is delivered exclusively via mobile devices rose from 1% in 2012 to 7% in 2014. While this clearly demonstrates that mobile BI is gaining traction, the actual adoption picture is rather more nuanced. Our ongoing research and client interactions show that mobile BI adopters fall into three overall groups.
Some organizations really ‘get’ the transformational potential of mobile BI. They understand that mobile BI is about much more than liberating reports and dashboards from the desktop, and they focus on how data can be leveraged to best effect when it is in the hands of the right person at the right time. If necessary, they’re prepared to change their business processes accordingly. For these companies, mobile BI is an enabler of strategic goals, and deployment is a journey, not an end in itself.
Other organizations make mobile BI available because it’s the right thing to do, or because they’ve been asked to. Many of them are reaping considerable benefits from their mobile BI implementations, and the more far-sighted are working out how to move from the tactical to the strategic. Equally, many are trying to figure out where to go from here, particularly if the initial deployment doesn’t show a clear benefit, let alone a return on investment.
On one level, IBM’s new z13, announced last Wednesday in New York, is exactly what the mainframe world has been expecting for the last two and a half years: more capacity (a big boost this time around – triple the main memory, more and faster cores, more I/O ports, etc.), a modest boost in price performance, and a very sexy cabinet design (I know it’s not really a major evaluation factor, but I think IBM’s industrial design for its system enclosures for Flex System, Power and the z System is absolutely gorgeous – it should be in the MOMA*). IBM indeed delivered against these expectations, and more. In this case, a lot more.
In addition to the required upgrades that fuel the normal mainframe upgrade cycle and its reasonably predictable revenue, IBM has made a bold but rational repositioning of the mainframe as a core platform for the workloads generated by mobile transactions – the most rapidly growing workload across all sectors of the global economy. What makes this positioning rational, as opposed to a pipe dream, is an underlying pattern common to many of these transactions: at some point they access data generated by and stored on a mainframe. By improving the economics of the increasingly Linux-centric processing chain that runs before the call for the mainframe data, IBM hopes to foster the migration of these workloads to the mainframe, where access to the resident data will be more efficient, benefiting from inherently lower latency as well as from embedded high-value functions such as accelerators for inline analytics. In essence, IBM hopes to shift the center of gravity for mobile processing toward the mainframe and away from the distributed x86 Linux systems that it no longer manufactures.
To compete in today's global economy, businesses and governments need agility and the ability to adapt quickly to change. The same applies to internal initiatives to roll out enterprise-grade business intelligence (BI) applications: BI change is ongoing, and often many things change concurrently. One element that too often takes a back seat is the impact of changes on the organization's people. Prosci, an independent research company focused on organizational change management (OCM), has developed benchmarks that propose five areas in which change management needs to do better. They all involve the people side of change: better engage the sponsor; begin organizational change management early in the change process; get employees engaged in change activities; secure sufficient personnel resources; and better communicate with employees. Because BI is not a single application — and often not even a single platform — we recommend adding a sixth area: visibility into BI usage and performance management of BI itself, aka BI on BI. Forrester recommends keeping these six areas top of mind as your organization prepares for any kind of change.
Some strategic business events, like mergers, are high-risk initiatives involving major changes over two or more years; others, such as restructurings, must be implemented in six months. In the case of BI, some changes might need to happen within a few weeks or even days. All changes will lead to either achieving or failing to achieve a business outcome. There are seven major categories of business and organizational change:
As researchers, we shouldn’t underestimate the power of perspective. When the Eiffel Tower was erected 125 years ago, it became the tallest manmade structure in the world and, more importantly, allowed visitors to look down over Paris for the first time; perhaps it was the first real instance of a “bird’s-eye view.” At the same time, artists like Picasso and Stein were pushing the limits of perspective by portraying every angle of three-dimensional subjects in one painting or poem. In many ways, the research world today is akin to this historical period of creativity. With more data at our fingertips than ever before, we can observe consumer behavior from new vantage points and produce a fresh understanding of customer trends by analyzing multiple angles at the same time.
Here on the data insights innovation team at Forrester, we’ve called our multiperspective research approach Technographics 360. Officially launched this year, Technographics 360 blends Consumer Technographics® survey output, ConsumerVoices Market Research Online Community insight, social listening data, and passive mobile behavioral tracking to synthesize a 360-degree view of consumer behavior. Instead of analyzing research questions by breaking them down, we can synthesize comprehensive solutions by building our knowledge up.
I’ve been talking to a number of users and providers of bare-metal cloud services, and am finding the high-profile use cases both interesting individually and starting to connect some dots across providers. These services offer the ability to provision and use dedicated physical servers with semantics very similar to those of the common VM IaaS cloud – servers that can be instantiated at will, provisioned with a variety of OS images, connected to storage and used to run applications. The differentiation for customers lies in the behavior of the resulting images:
Deterministic performance – Your workload is running on a dedicated resource, so there is no question of any “noisy neighbor” problem, or even of sharing resources with otherwise well-behaved neighbors.
Extreme low latency – Like it or not, VMs, even lightweight ones, impose some level of additional latency compared to bare-metal OS images. Where this latency is a factor, bare-metal clouds offer a differentiated alternative.
Raw performance – Under the right conditions, a single bare-metal server can process more work than a collection of VMs, even when their nominal aggregate performance is similar. Benchmarking is always tricky, but several of the bare metal cloud vendors can show some impressive comparative benchmarks to prospective customers.
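To make the "VM-like semantics, physical resource" point concrete, here is a minimal sketch of what a provisioning call might look like. Everything here is hypothetical and purely illustrative: `metalcloud.example`, the field names, and `build_provision_request` are stand-ins, not any vendor's actual API. The point is that the request shape mirrors a VM IaaS call, while the resource delivered is a dedicated physical server.

```python
# Illustrative sketch only: "metalcloud.example" and the request schema
# below are hypothetical stand-ins for a bare-metal cloud provider's API.
import json

API_BASE = "https://metalcloud.example/v1"  # hypothetical endpoint

def build_provision_request(hostname, os_image, storage_gb):
    """Assemble the JSON body for a dedicated-server provisioning call.

    Semantically this mirrors a VM IaaS request -- pick an OS image,
    attach storage -- but the resource delivered is a physical server,
    which is what yields the deterministic performance and low latency
    described above.
    """
    return {
        "hostname": hostname,
        "image": os_image,          # e.g. a stock Linux image
        "storage": {"size_gb": storage_gb, "type": "attached"},
        "dedicated": True,          # the bare-metal differentiator
    }

req = build_provision_request("db-01", "ubuntu-14.04", 500)
print(json.dumps(req, indent=2))
```

In a real deployment this body would be POSTed to the provider's provisioning endpoint; the only customer-visible difference from a VM request is the guarantee that no hypervisor or neighbor shares the hardware.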
There is always a tendency to regard the major players in large markets as a static background against which the froth of smaller companies and the rapid dance of customer innovation plays out. But if we turn our lens toward the major server vendors (who are now also storage, networking and software vendors), we see that the relatively flat industry revenues hide almost continuous churn. Turn back the clock slightly more than five years, and the market was dominated by three vendors: HP, Dell and IBM. Since then, IBM has divested itself of the highest-velocity portion of its server business, Dell is no longer a public company, Lenovo is now a major player in servers, Cisco has come out of nowhere to mount a serious challenge in the x86 server segment, and HP has announced that it intends to split itself into two companies.
And it hasn’t stopped. Two recent events – the fracturing of the VCE consortium and the formerly unthinkable hook-up of IBM and Cisco – illustrate the urgency with which existing players are seeking differential advantage, and reinforce our contention that converged and integrated infrastructure remains one of the more active and profitable segments of the industry.
EMC’s recent acquisition of Cisco’s interest in VCE effectively acknowledged what most customers have been telling us for a long time – that VCE had become essentially an EMC-driven sales vehicle to sell storage, supported by VMware (owned by EMC) and Cisco as a systems platform. EMC’s purchase of Cisco’s interest also tacitly acknowledges two underlying tensions in the converged infrastructure space:
By now you have at least seen the cute little elephant logo or you may have spent serious time with the basic components of Hadoop like HDFS, MapReduce, Hive, Pig and most recently YARN. But do you have a handle on Kafka, Rhino, Sentry, Impala, Oozie, Spark, Storm, Tez… Giraph? Do you need a Zookeeper? Apache has one of those too! For example, the latest version of Hortonworks Data Platform has over 20 Apache packages and reflects the chaos of the open source ecosystem. Cloudera, MapR, Pivotal, Microsoft and IBM all have their own products and open source additions while supporting various combinations of the Apache projects.
After hearing the confusion between Spark and Hadoop one too many times, I was inspired to write a report, The Hadoop Ecosystem Overview, Q4 2014. For those whose day jobs don’t include constantly tracking Hadoop’s evolution, I dove in and worked with Hadoop vendors and trusted consultants to create a framework. We divided the complex Hadoop ecosystem into a core set of tools that all work closely with data stored in the Hadoop Distributed File System (HDFS) and an extended group of components that leverage but do not require it.
In the past, enterprise architects could afford to think big picture, which meant treating Hadoop as a single package of tools. Not anymore – you need to understand the details to keep up in the age of the customer. Use our framework to help, but please read the report if you can, as it includes a lot more detail.
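Much of the Spark-versus-Hadoop confusion dissolves once you see what the core programming model actually does. The toy below is a single-process, in-memory illustration of the map → shuffle → reduce pattern that Hadoop MapReduce distributes across a cluster (and that Spark executes in memory); word count is the canonical example. This is a teaching sketch, not Hadoop or Spark code.

```python
# A toy, single-process illustration of the map -> shuffle -> reduce
# pattern that Hadoop MapReduce distributes across a cluster.
# Word count is the canonical example.
from collections import defaultdict

def map_phase(lines):
    # Map: emit (key, value) pairs -- here, (word, 1) for every word.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    # Shuffle: group all values by key, as the framework would do
    # between the map and reduce stages.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: combine each key's values -- here, sum the counts.
    return {key: sum(values) for key, values in grouped.items()}

lines = ["Hadoop stores data", "Spark processes data"]
counts = reduce_phase(shuffle_phase(map_phase(lines)))
print(counts["data"])  # -> 2
```

In a real cluster, each phase runs in parallel on many nodes and the shuffle moves data over the network; the ecosystem packages mentioned above (YARN, Tez, Spark, and so on) largely exist to schedule, optimize or replace pieces of this pipeline.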
Digital transformation will drive technology spending growth of 4.9%. Always-connected, technology-empowered customers are redefining sources of competitive advantage for AP organizations. In fact, 79% of business and technology decision-makers that Forrester surveyed indicated that improving the experience of technology-empowered customers will be a high or critical priority for their business in 2015. Similarly, 57% said that meeting consumers’ rising expectations was one of the reasons they would spend more money on technology next year — the top reported reason for increased technology spending.
An inquiry call from a digital strategy agency advising a client on data commercialization generated a lively discussion about strategies for taking data to market. With few best practices out there, this emerging opportunity can feel like space exploration – boldly going where no one has gone before. The question is increasingly common: "We think we have data that would be of use to others, but how do we know? And which use cases should we pursue?" In It's Time To Take Your Data To Market, published earlier this fall, my colleagues and I provided some guidance on identifying and commercializing that "Picasso in the attic." But ideas about how to go to market continue to evolve.
In answer to the questions asked on that inquiry call, my advice was pretty simple: don’t try to anticipate all possible uses of the data. Get started by making selected data sets available for people to play with, see what they can do with the data, and talk about it to spread the word. That said, some specific use cases can kick-start the process.
Look to your existing customers.
The grass is not always greener elsewhere, and your existing clients might just provide some fertile ground. A couple of thoughts on ways your existing customers could use new data sources: