Big data is undergoing big change, but most companies are missing it or just grasping at the edges. My colleague Fatemeh Khatibloo and I have just completed an exhaustive study of the big data phenomenon. We found a familiar pattern: business confusion in the face of stern warnings about the dangers of big data and vendor-sponsored papers extolling its benefits. Here’s what we found hidden beneath the buzz:
As data explodes, so do old ways of doing business.
Everywhere we look, we find businesses using more diverse, messier, and larger data sets to stay competitive in the age of the customer: the consumer goods firm that allocated marketing dollars based on flu trend predictions, or the oil and gas companies that used weather data to predict the movement of icebergs and extend their drilling season. Savvy businesses find ways to turn more data into a competitive advantage. If your firm doesn't get this, the consequences won't be pretty, and they will arrive in the not-too-distant future.
Technology managers and architects can’t afford to sit back and think that their Hadoop project will deliver everything the business needs. Nor can you afford to think that big data isn’t for you because you don’t have that much data. Why? Because “big data” is really the practices and technologies that close the gap between the available data and the ability to turn that data into business insight — insight that your firm needs to survive and thrive in the age of the customer. Four things to understand:
BI is no longer a nice-to-have back-office application that counts widgets; leading organizations now use it as a key competitive differentiator. For decades, most BI business cases were based on intangible benefits, but those days are over: today, 41% of professionals with knowledge of their firm's business case report that it rests on tangible benefits, such as increased margin or profitability. As a result, BI is front and center on most enterprise agendas. In 2014, North American data and analytics technology decision-makers who know their firm's technology budget told Forrester that 15% of their technology management budget will go toward BI-related purchases, initiatives, and projects.
But taking advantage of this trend by deploying a single centralized BI platform is easier said than done at most organizations. Legacy platforms, mergers and acquisitions (M&A), BI embedded into enterprise resource planning (ERP) applications, and organizational silos are just a few reasons why no large organization out there has a single enterprise BI platform. Anecdotal evidence shows that most enterprises have three or more enterprise BI platforms and many more shadow IT BI platforms.
But Avoid Ending Up With A Zoo Of Individual Big Data Solutions
We are beyond the point of struggling over the definition of big data. That doesn't mean we've resolved all of the confusion surrounding the term, but companies today are struggling instead with a different question: how to actually get started with big data.
28% of all companies are planning a big data project in 2014.
According to Forrester's Business Technographics™ Global Data And Analytics Survey, 2014, 28% of the more than 1,600 responding companies globally are planning a big data project this year. More detail, including how this splits between IT-driven and business-driven projects, can be found in our new Forrester report, "Reset On Big Data."
Or join our Forrester Forum For Technology Leaders in London on June 12-13, 2014, to hear and discuss with us directly what big data projects your peers are planning, what challenges they are facing, and what goals they aim to achieve.
On May 14, Acxiom announced its intention to acquire LiveRamp, a "data onboarding service," to the tune of $310 million in cash. Several Forrester analysts (Fatemeh Khatibloo, Susan Bidel, Sri Sridharan, and I) cover these two firms, and what follows is our collective thinking on the impending acquisition after having been briefed by Acxiom's leadership on the matter.
“Business Intelligence in the cloud? You’ve got to be joking!” That’s the response I got when I recently asked a client whether they’d considered availing themselves of a software-as-a-service (SaaS) solution to meet a particular BI need. Well, I wasn’t joking. There are many scenarios when it makes sense to turn to the cloud for a BI solution, and increasing numbers of organizations are indeed doing so. Indications are also that companies are taking a pragmatic approach to cloud BI, headlines to the contrary notwithstanding. Forrester has found that:
· Less than one third of organizations have no plans for cloud BI. When we asked respondents in our Forrsights Software Survey, Q4 2013, whether they were using SaaS BI in the cloud or intending to do so, fewer than one third said they had no plans. Of the rest, 34% were already using cloud BI, and 31% had cloud in their BI plans for the next two years. But it's not a case of either/or: the majority of those who have either already adopted cloud BI or intend to do so are using the SaaS system to complement their existing BI and analytics capabilities. Still, it's worth noting that 12% of survey respondents had already replaced most or all of their existing BI systems with SaaS, and a further 16% intended to do so.
No self-respecting EA professional would enter into planning discussions with business or tech management execs without a solid grasp of the technologies available to the enterprise, right? But what about the data available to the enterprise? Given the shift toward data-driven decision-making and the clear advantages of advanced analytics capabilities, architecture professionals should come to the planning table with not only an understanding of enterprise data but also a working knowledge of the available third-party data that could significantly shape their approach to customer engagement or their B2B partner strategy.
Data discussions can't simply be about internal information flow, master data, and business glossaries anymore. Enterprise architects, business architects, and information architects working with business execs on tech-enabled strategies need to bring third-party data know-how to their brainstorming and planning discussions. As the data economy is still in its relatively early stages and, more to the point, as organizational responsibilities for sourcing, managing, and governing third-party data are still taking shape, it behooves architects to take the lead in understanding the data economy in some detail. By doing so, architects can help their organizations find innovative approaches to data and analytics that have direct business impact: improving the customer experience, making the partner ecosystem more effective, or finding new revenue from data-driven products.
An explosion of data is revolutionizing business practices. The availability of new data sources and delivery models provides unprecedented insight into customer and partner behavior and a much improved capacity to understand and optimize business processes and operations. Real-time data allows companies to fine-tune inventories and in-store product placement; it allows restaurants to know what a customer will order even before they read the menu or reach the counter. Data is also the foundation for new service offerings from companies like John Deere, BMW, and Starwood.
Since Tibco acquired Jaspersoft on April 28th, 2014, I keep being asked the question: “Will this deal change the BI and analytics landscape?” (If you missed the announcement, here’s the press release.)
The short answer is: it could. The longer answer goes something like this: Jaspersoft and Tibco Spotfire complement each other nicely; Jaspersoft brings ETL and embedded BI to the table, whereas Spotfire has superior data analysis, discovery, and visualization capabilities. Jaspersoft’s open source business model provides Tibco with a different path to market, and Jaspersoft can benefit from Tibco’s corporate relationships and sales infrastructure. And with its utility-based cloud service, Jaspersoft also adds another option to Spotfire’s SaaS BI offering.
But that's only the narrow view: once you take into consideration Tibco's history (the hint is in the name: "The Information Bus Company") and its more recent string of acquisitions, a much larger potential story emerges. Starting with Spotfire in 2007, Tibco has assembled a powerful set of capabilities, including (but not limited to) analytics, data management, event processing, and related technologies such as customer loyalty management and mapping. If Tibco manages to leverage all of its assets in a way that provides enterprises with a flexible, agile, integrated platform that helps them turn their data into actionable information, it will be a powerful new force with the potential to change the enterprise BI platform market.
To get there, Tibco has a number of challenges to address. On a tactical basis, it’s all about making the Jaspersoft acquisition work:
· Retaining the talent.
· Making it easy for clients and prospects to engage with both companies.
On April 23, IBM rolled out the long-awaited POWER8 CPU, the successor to POWER7+. Given the extensive pre-announcement speculation, the hardware itself was no big surprise (the details are fascinating, but not suitable for this venue): an estimated 30% to 50% improvement in application performance over the latest POWER7+, with the potential for order-of-magnitude improvements on selected big data and analytics workloads. While the technology is interesting, we are pretty numb to the "bigger, better, faster" messaging that inevitably accompanies new hardware announcements; the real impact of this announcement lies in its utility for current AIX users and in IBM's increased focus on Linux and its support of the OpenPOWER initiative.
OK, so we're numb, but it's still interesting. POWER8 is an entirely new processor generation implemented in 22 nm CMOS (the same geometry as Intel's high-end CPUs). The processor features up to 12 cores, each with up to 8 threads, and a focus on not only throughput but high performance per thread and per core for low-thread-count applications. Added to the mix is up to 1 TB of memory per socket, massive PCIe 3 I/O connectivity, and Coherent Accelerator Processor Interface (CAPI), IBM's technology to deliver memory-controller-based access for accelerators and flash memory in POWER systems. CAPI figures prominently in IBM's positioning of POWER as the ultimate analytics engine, with the announcement profiling the performance of a configuration using 40 TB of CAPI-attached flash for huge in-memory analytics at a fraction of the cost of a non-CAPI configuration.
A Slam Dunk For AIX Users And A New Play For Linux
Most apps are dead boring. Sensors can help add some zing. Sensors are data collectors that measure physical properties of the real world, such as location, pressure, humidity, touch, voice, and much more. You can find sensors just about anywhere these days, most obviously in mobile devices, which have accelerometers, GPS, microphones, and more. The Internet of Things (IoT) refers to the proliferation of Internet-connected, accessible sensors expanding into every corner of humanity. Yet most applications fail to use sensors to their full potential. Sensor data can help make your apps predictive, impressing customers, making workers more efficient, and boosting your career as an application developer.
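To make the idea concrete, here is a minimal sketch of turning raw sensor readings into an actionable signal: smoothing accelerometer magnitudes with a moving average and flagging sustained motion. The sample values, window size, and 1.2 g threshold are illustrative assumptions, not figures from any specific device or product.

```python
# Minimal sketch: smooth raw accelerometer magnitudes with a moving
# average, then flag sustained motion. All numbers here are hypothetical.
from collections import deque

def moving_average(samples, window=3):
    """Yield the moving average of `samples` over a sliding window."""
    buf = deque(maxlen=window)
    for s in samples:
        buf.append(s)
        yield sum(buf) / len(buf)

def detect_motion(samples, threshold=1.2, window=3):
    """Return True if any smoothed reading exceeds `threshold` (in g)."""
    return any(avg > threshold for avg in moving_average(samples, window))

# Usage: magnitudes (in g) from a hypothetical accelerometer feed.
idle = [1.0, 1.01, 0.99, 1.0, 1.02]   # device at rest: ~1 g (gravity only)
shaken = [1.0, 1.5, 2.3, 1.8, 1.1]    # a shake spikes the magnitude

print(detect_motion(idle))    # False
print(detect_motion(shaken))  # True
```

Smoothing before thresholding is the key design choice here: a single noisy spike from a raw sensor stream won't trigger the signal, only a sustained change will, which is what a predictive app usually cares about.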