“Business Intelligence in the cloud? You’ve got to be joking!” That’s the response I got when I recently asked a client whether they’d considered availing themselves of a software-as-a-service (SaaS) solution to meet a particular BI need. Well, I wasn’t joking. There are many scenarios when it makes sense to turn to the cloud for a BI solution, and increasing numbers of organizations are indeed doing so. Indications are also that companies are taking a pragmatic approach to cloud BI, headlines to the contrary notwithstanding. Forrester has found that:
· Less than one third of organizations have no plans for cloud BI. When we asked respondents in our Forrsights Software Survey Q4 2013 whether they were using SaaS BI in the cloud, or were intending to do so, not even one third declared that they had no plans. Of the rest, 34% were already using cloud BI, and 31% had cloud in their BI plans for the next two years. But it’s not a case of either/or: the majority of those who’ve either already adopted cloud BI or are intending to do so are using the SaaS system to complement their existing BI and analytics capabilities. Still, it’s worth noting that 12% of survey respondents had already replaced most or all of their existing BI systems with SaaS, and a further 16% were intending to do so.
An explosion of data is revolutionizing business practices. The availability of new data sources and delivery models provides unprecedented insights into customer and partner behavior and enables a much improved capacity to understand and optimize business processes and operations. Real-time data allows companies to fine-tune inventories and in-store product placement; it allows restaurants to know what a customer will order, even before they read the menu or reach the counter. And data is also the foundation for new service offerings for companies like John Deere, BMW, and Starwood.
Since Tibco acquired Jaspersoft on April 28th, 2014, I keep being asked the question: “Will this deal change the BI and analytics landscape?” (If you missed the announcement, here’s the press release.)
The short answer is: it could. The longer answer goes something like this: Jaspersoft and Tibco Spotfire complement each other nicely; Jaspersoft brings ETL and embedded BI to the table, whereas Spotfire has superior data analysis, discovery, and visualization capabilities. Jaspersoft’s open source business model provides Tibco with a different path to market, and Jaspersoft can benefit from Tibco’s corporate relationships and sales infrastructure. And with its utility-based cloud service, Jaspersoft also adds another option to Spotfire’s SaaS BI offering.
But that’s only the narrow view: once you take into consideration Tibco’s history (the hint’s in the name - “The Information Bus Company”) and the more recent string of acquisitions, a much larger potential story emerges. Starting with Spotfire in 2007, Tibco has assembled a powerful set of capabilities, including (but not limited to) analytics, data management, event processing, and related technologies such as customer loyalty management and mapping. If Tibco manages to leverage all of its assets in a way that provides enterprises with a flexible and agile integrated platform that helps them turn their data into actionable information, it will be a powerful new force with the potential to change the enterprise BI platforms market.
To get there, Tibco has a number of challenges to address. On a tactical basis, it’s all about making the Jaspersoft acquisition work:
· Retaining the talent
· Making it easy for clients and prospects to engage with both companies
On April 23, IBM rolled out the long-awaited POWER8 CPU, the successor to POWER7+. Given the extensive pre-announcement speculation, the hardware itself was no big surprise (the details are fascinating, but not suitable for this venue): an estimated 30% to 50% improvement in application performance over the latest POWER7+, with potential for order-of-magnitude improvements on selected big data and analytics workloads. While the technology is interesting, we are pretty numb to the “bigger, better, faster” messaging that inevitably accompanies new hardware announcements; the real impact of this announcement lies in its utility for current AIX users and in IBM’s increased focus on Linux and its support of the OpenPOWER initiative.
OK, so we’re numb, but it’s still interesting. POWER8 is an entirely new processor generation implemented in 22 nm CMOS (the same geometry as Intel’s high-end CPUs). The processor features up to 12 cores, each with up to 8 threads, and a focus not only on throughput but also on high performance per thread and per core for low-thread-count applications. Added to the mix is up to 1 TB of memory per socket, massive PCIe 3 I/O connectivity, and the Coherent Accelerator Processor Interface (CAPI), IBM’s technology to deliver memory-controller-based access for accelerators and flash memory in POWER systems. CAPI figures prominently in IBM’s positioning of POWER as the ultimate analytics engine, with the announcement profiling the performance of a configuration using 40 TB of CAPI-attached flash for huge in-memory analytics at a fraction of the cost of a non-CAPI configuration.
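To put those per-socket figures in perspective, here is a bit of back-of-the-envelope arithmetic on the numbers quoted above (the derived per-thread figure is purely illustrative, not an IBM specification):

```python
# Illustrative arithmetic on the POWER8 per-socket figures quoted above:
# up to 12 cores, up to 8 hardware threads per core, up to 1 TB of memory.
cores_per_socket = 12
threads_per_core = 8
memory_per_socket_tb = 1

threads_per_socket = cores_per_socket * threads_per_core  # 96 hardware threads
memory_per_thread_gb = memory_per_socket_tb * 1024 / threads_per_socket

print(f"{threads_per_socket} hardware threads per socket")
print(f"~{memory_per_thread_gb:.1f} GB of memory per hardware thread at full thread count")
```

Even fully threaded, each hardware thread can still be backed by roughly 10 GB of memory, which is part of why IBM pitches the chip at in-memory analytics workloads.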
A Slam-dunk for AIX users and a new play for Linux
Management consultants and business intelligence, analytics, and big data systems integrators often use the terms accelerators, blueprints, solutions, frameworks, and products to show off their industry and business domain (sales, marketing, finance, HR, etc.) expertise, experience, and specialization. Unfortunately, they often use these terms synonymously, while in reality the meanings vary quite widely. Here’s our pragmatic take on the tangible reality behind the terms (in increasing order of comprehensiveness):
Frameworks. Often little more than a collection of best practices and lessons learned from multiple client engagements. These can sometimes shave 5% to 10% off a project’s time and effort, mainly by letting buyers learn from the mistakes others have already made rather than repeating them.
Solution Accelerators. Also known as Blueprints, these are usually a collection of deliverables, content, and other artifacts from prior client engagements. Such artifacts can take the form of data connectors, transformation logic, data models, metrics, reports, and dashboards, but they are often little more than existing deliverables that can be cut and pasted or otherwise leveraged in a new client engagement. Like Frameworks, Solution Accelerators often come with a set of best practices. They can help you hit the ground running: rather than starting from scratch, you find yourself 10% to 20% into a project.
Solutions. A step above Solution Accelerators, Solutions prepackage artifacts from prior client engagements, cleansing them and stripping them of proprietary content and irrelevant information. Count on shaving 20% to 30% off the effort.
To jump on this R feeding frenzy, most leading BI vendors claim that they “integrate with R” - but what does that claim really mean? Our take: not all BI/R integration is created equal. When evaluating BI platforms for R integration, Forrester recommends considering the following integration capabilities:
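At the shallow end of the spectrum, “integrates with R” can mean little more than shelling out to Rscript with data exported to a file and results read back. The sketch below is hypothetical - the script name, file paths, and helper functions are my own inventions to illustrate the pattern, not any particular vendor’s implementation:

```python
# Hypothetical sketch of the shallowest form of BI/R "integration":
# export rows to CSV, invoke an external R script, collect the output file.
import subprocess
import tempfile


def build_r_command(script_path: str, input_csv: str, output_csv: str) -> list:
    """Assemble the Rscript invocation a BI tool might issue.
    '--vanilla' keeps the R session free of user profiles and saved state."""
    return ["Rscript", "--vanilla", script_path, input_csv, output_csv]


def run_r_step(rows, script_path: str) -> str:
    """Dump rows to a CSV file, hand it to R, and return the path of the
    CSV the R script is asked to write back (assumes R is installed)."""
    with tempfile.NamedTemporaryFile("w", suffix=".csv", delete=False) as f:
        f.write("\n".join(",".join(map(str, row)) for row in rows))
        input_csv = f.name
    output_csv = input_csv.replace(".csv", "_scored.csv")
    subprocess.run(build_r_command(script_path, input_csv, output_csv), check=True)
    return output_csv
```

Deeper integrations push computation to R in-process, translate R data frames to and from the platform’s native structures, or surface R visualizations inside dashboards - which is exactly why the claim deserves scrutiny.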
IBM recently kicked off its big data market planning for 2014 and released a white paper that discusses how analytics create new business value for end user organizations. The major differences compared with last year’s event:
Organizational change. IBM has assigned a new big data practice leader for China, similar to what it’s done for other new technologies including mobile, social, and cloud. IBM can integrate resources from infrastructure (IBM STG), software (IBM SWG), and services (IBM GBS/GTS) teams, although the team members do not report directly to the practice leader.
A new analytics platform powered by Watson technology. The Watson Foundation platform has three new functions. It can be deployed on SoftLayer; it extends IBM’s big data analysis capabilities to social, mobile, and cloud; and it offers enterprises the power and ease of use of Watson analysis.
Measurable benefits from customer insights analysis. Chinese organizations have started to buy into the value of analytics and would like to invest in technology tools to optimize customer insights. AmorePacific, a Hong Kong-based skin care and cosmetics company, is using IBM’s SPSS predictive analytics solution to craft tailored messages to its customers and has improved its response rate by more than 30%. It primarily analyzes point-of-sale data, demographic information from its loyalty program, and market data such as property values in the neighborhoods where customers live.
Coming back from the SAS Industry Analyst Event left me with one big question: are we tracking the recommendations and insights produced by analysis to see whether they actually led to positive or negative results?
It's a big question for data governance that I'm not hearing discussed around the table. We often emphasize how data is supplied, but how it performs in its consumed state is forgotten.
When leading business intelligence and analytics teams, I always pushed to create reports and analysis that ultimately incented action. What you know should influence behavior and decisions, even if the influence is to say, "Don't change, keep up the good work!" This should be a fundamental function of data governance. We need to care not only that the data is in the right form factor but also to review what the data tells us, how we interpret it, and whether it made us better.
I've talked about the closed loop from a master data management perspective - what you learn about customers will alter and enrich the customer master. The connection to data governance is pretty clear in this case. However, we shouldn't stop at raw data and master definitions. Our attention needs to extend to the data business users receive and whether it is trusted and accurate. This goes back to the fact that how the business defines data is more than what exists in a database or application. Data is a total, a percentage, an index. This derived data is what the business expects to govern - and if derived data isn't supporting business objectives, that has to be incorporated into the data governance discussion.
This week, IBM announced its new line of x86 servers, and included among the usual incremental product improvements is a performance game-changer called eXFlash. eXFlash is the first commercially available implementation of the MCS architecture announced last year by Diablo Technologies. The MCS architecture, and IBM’s eXFlash offering in particular, allows flash memory to be embedded on the system as close to the CPU as main memory, with latencies substantially lower than any other available flash option, offering better performance at a lower solution cost than other embedded flash solutions. Key aspects of the announcement include:
■ Flash DIMMs offer scalable high performance. Write latency (a critical metric) for IBM eXFlash will be in the 5 to 10 microsecond range, whereas best-of-breed competing mezzanine card and PCIe flash can only offer 15 to 20 microseconds (and external flash storage is slower still). Additionally, since the DIMMs are directly attached to the memory controller, flash I/O does not compete with other I/O on the system I/O hub and PCIe subsystem, improving overall system performance for heavily loaded systems. Additional benefits include linear performance scalability as the number of DIMMs increases and optional built-in hardware mirroring of DIMM pairs.
■ eXFlash DIMMs are compatible with current software. Part of the magic of MCS flash is that it appears to the OS as a standard block-mode device, so all existing block-mode software will work, including applications, caching and tiering software, and general storage management software. For IBM users, compatibility with IBM’s storage management and FlashCache Storage Accelerator solutions is guaranteed. Other vendors will face little to no effort in qualifying their solutions.
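To give a feel for what those latency figures imply, here is some back-of-the-envelope arithmetic: at queue depth 1, the I/O rate a single thread can sustain is roughly the inverse of the write latency. The IOPS numbers derived below are illustrative calculations from the latency ranges quoted above, not vendor benchmarks:

```python
# Back-of-the-envelope: queue-depth-1 IOPS is roughly 1 / latency.
# Latency figures come from the announcement summary above; the derived
# IOPS numbers are illustrative, not measured results.

def qd1_iops(latency_us: float) -> int:
    """Approximate single-threaded (queue depth 1) IOPS for a given latency."""
    return round(1_000_000 / latency_us)

for name, latency in [("eXFlash DIMM (best case)", 5),
                      ("eXFlash DIMM (worst case)", 10),
                      ("PCIe flash (best case)", 15),
                      ("PCIe flash (worst case)", 20)]:
    print(f"{name:26s} {latency:3d} us -> ~{qd1_iops(latency):,} IOPS at QD1")
```

By this rough measure, a 5-microsecond write path supports about four times the single-threaded write rate of a 20-microsecond PCIe path, which is why latency, not just bandwidth, is the headline metric here.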