An explosion of data is revolutionizing business practices. The availability of new data sources and delivery models provides unprecedented insight into customer and partner behavior and a much-improved capacity to understand and optimize business processes and operations. Real-time data allows companies to fine-tune inventories and in-store product placement; it allows restaurants to know what a customer will order even before they read the menu or reach the counter. And data is the foundation for new service offerings at companies like John Deere, BMW, and Starwood.
Since Tibco acquired Jaspersoft on April 28th, 2014, I keep being asked the question: “Will this deal change the BI and analytics landscape?” (If you missed the announcement, here’s the press release.)
The short answer is: it could. The longer answer goes something like this: Jaspersoft and Tibco Spotfire complement each other nicely; Jaspersoft brings ETL and embedded BI to the table, whereas Spotfire has superior data analysis, discovery, and visualization capabilities. Jaspersoft’s open source business model provides Tibco with a different path to market, and Jaspersoft can benefit from Tibco’s corporate relationships and sales infrastructure. And with its utility-based cloud service, Jaspersoft also adds another option to Spotfire’s SaaS BI offering.
But that’s only the narrow view: once you take into consideration Tibco’s history (the hint’s in the name - “The Information Bus Company”) and its more recent string of acquisitions, a much larger potential story emerges. Starting with Spotfire in 2007, Tibco has assembled a powerful set of capabilities, including (but not limited to) analytics, data management, event processing, and related technologies such as customer loyalty management and mapping. If Tibco manages to leverage all of its assets in a way that provides enterprises with a flexible, agile, integrated platform that helps them turn their data into actionable information, it will be a powerful new force with the potential to change the enterprise BI platform market.
To get there, Tibco has a number of challenges to address. On a tactical basis, it’s all about making the Jaspersoft acquisition work:
Retaining the talent
Making it easy for clients and prospects to engage with both companies
On April 23, IBM rolled out the long-awaited POWER8 CPU, the successor to POWER7+. Given the extensive pre-announcement speculation, the hardware itself was no big surprise (the details are fascinating, but not suitable for this venue): an estimated 30% to 50% improvement in application performance over the latest POWER7+, with potential for order-of-magnitude improvements on selected big data and analytics workloads. While the technology is interesting, we are pretty numb to the “bigger, better, faster” messaging that inevitably accompanies new hardware announcements; the real impact of this announcement lies in its utility for current AIX users and in IBM’s increased focus on Linux and its support of the OpenPOWER initiative.
OK, so we’re numb, but it’s still interesting. POWER8 is an entirely new processor generation implemented in 22 nm CMOS (the same geometry as Intel’s high-end CPUs). The processor features up to 12 cores, each with up to 8 threads, and a focus not only on throughput but also on high performance per thread and per core for low-thread-count applications. Added to the mix is up to 1 TB of memory per socket, massive PCIe 3 I/O connectivity, and the Coherent Accelerator Processor Interface (CAPI), IBM’s technology for delivering memory-controller-based access to accelerators and flash memory in POWER systems. CAPI figures prominently in IBM’s positioning of POWER as the ultimate analytics engine, with the announcement profiling the performance of a configuration using 40 TB of CAPI-attached flash for huge in-memory analytics at a fraction of the cost of a non-CAPI configuration.
A Slam-dunk for AIX users and a new play for Linux
Most apps are dead boring. Sensors can help add some zing. Sensors are data collectors that measure physical properties of the real world, such as location, pressure, humidity, touch, voice, and much more. You can find sensors just about anywhere these days, most obviously in mobile devices, which have accelerometers, GPS, microphones, and more. The Internet of Things (IoT) refers to the proliferation of Internet-connected, accessible sensors expanding into every corner of humanity. Yet most applications barely use them to the fullest extent possible. Data from sensors can make your apps predictive enough to impress customers, make workers more efficient, and boost your career as an application developer.
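As a minimal sketch of what "predictive" can mean here, the snippet below classifies whether a device is moving from a handful of accelerometer readings. The sample values and the threshold are hypothetical; real samples would come from a platform sensor API such as Android's SensorManager or iOS's Core Motion.

```python
import math

# Hypothetical accelerometer samples (x, y, z acceleration in g);
# a resting device reads roughly 1 g total (gravity alone).
still_samples = [(0.01, 0.02, 1.00), (0.00, 0.03, 0.99), (0.02, 0.01, 1.01)]
walking_samples = [(0.40, 0.20, 1.30), (0.10, 0.50, 0.70), (0.60, 0.30, 1.50)]

def magnitude(x, y, z):
    """Overall acceleration magnitude of one sample."""
    return math.sqrt(x * x + y * y + z * z)

def classify_motion(samples, threshold=1.1):
    """Naive activity guess: an average magnitude near 1 g means the
    device is at rest; an average well above it suggests movement."""
    avg = sum(magnitude(*s) for s in samples) / len(samples)
    return "moving" if avg > threshold else "still"
```

A production app would use a richer feature set and a trained model, but even this crude threshold shows how raw sensor streams become an app behavior.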
We've been talking about Adaptive Intelligence (AI) for a while now. As a refresher, AI is the real-time, multidirectional sharing of data to derive contextually appropriate, authoritative knowledge that helps maximize business value.
Increasingly in inquiries, workshops, FLB sessions, and advisories, we hear from our customer insights (CI) clients that developing the capabilities required for adaptive intelligence would actually help them solve a lot of other problems, too. For example:
A systematic data innovation approach encourages knowledge sharing throughout the organization, reduces data acquisition redundancies, and brings energy and creativity to the CI practice.
A good handle on data origin kickstarts your marketing organization's big data process by providing a well-audited foundation to build upon.
Better data governance and data controls improve your privacy and security practices by ensuring cross-functional adoption of the same set of standards and processes.
Better data structure puts more data in the hands of analysts and decision-makers, in the moment and within the systems of need (e.g., campaign management tools, content management systems, customer service portals, and more).
More data interoperability enables channel-agnostic customer recognition and the ability to ingest novel forms of data -- such as preference and wearables data -- that can vastly improve your ability to deliver great customer experiences.
Management consultants and business intelligence, analytics, and big data systems integrators often use the terms accelerators, blueprints, solutions, frameworks, and products to show off their industry and business domain (sales, marketing, finance, HR, etc.) expertise, experience, and specialization. Unfortunately, they often use these terms synonymously, while in pragmatic reality the meanings vary quite widely. Here’s our pragmatic take on the tangible reality behind the terms (in increasing order of comprehensiveness):
Frameworks. Often little more than a collection of best practices and lessons learned from multiple client engagements. These can sometimes shave 5%-10% off a project's time and effort, mainly by enabling buyers to learn from the mistakes others have already made rather than repeating them.
Solution Accelerators. Also known as blueprints, these are usually a collection of deliverables, content, and other artifacts from prior client engagements. Such artifacts could take the form of data connectors, transformation logic, data models, metrics, reports, and dashboards, but they are often little more than existing deliverables that can be cut and pasted or otherwise leveraged in a new client engagement. Like frameworks, solution accelerators often come with a set of best practices. They can help you hit the ground running: rather than starting from scratch, you find yourself 10%-20% into a project.
Solutions. A step above Solution Accelerators, Solutions prepackage artifacts from prior client engagements, by cleansing and stripping them of proprietary content and/or irrelevant info. Count on shaving 20% to 30% off the effort.
So you need some work done that you’ve never had done before or you need to buy something you’ve never bought before. What should you pay? That can be a tough question. What seems reasonable? Sometimes we set arbitrary rules. It’s OK if it’s under $50 or under $100. But that’s just a reassurance that you’re not getting ripped off too badly. Certainly the best way to avoid that outcome is to know how much that service or thing is worth, or at least know what others have paid for the same thing.
Fortunately now, in the age of the customer, that’s easier to find out. Price information for most consumer goods is easier to come by, making the buying process more efficient. But what about governments? We’ve all heard about the $600 toilet seat or the $400 hammer. Stories of government spending excess and mismanagement abound. Some are urban legends or misrepresentations. Others have legs — such as the recent reports of Boeing overcharging the US Army. While these incidents are likely not things of the past, open data initiatives have made significant progress in exposing spending data and improving transparency. Citizens can visit sites such as USAspending.gov for US federal government spending or "Where Does My Money Go?" for details on UK national government spending, and most large cities publish spending as well.
To jump on this R feeding frenzy, most leading BI vendors claim that they “integrate with R” - but what does that claim really mean? Our take on this: not all BI/R integration is created equal. When evaluating BI platforms for R integration, Forrester recommends considering the following integration capabilities:
Usually, when a product or service shouts about its low pricing, that’s a bad thing. But in Google’s case, there’s unique value in its Sustained Use Discounts program, which just might make it worth your consideration.
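To make the value concrete, here is a sketch of how the discount accrues, using the tiered schedule Google described at launch (each additional quarter of the month an instance runs is billed at a steeper discount: 100%, 80%, 60%, then 40% of the base rate). The tier values and the 730-hour month are assumptions worth verifying against Google's current pricing pages.

```python
# Tier schedule (assumed from the launch announcement): fraction of
# the month covered by the tier, and the multiplier on the base rate.
TIERS = [(0.25, 1.00), (0.50, 0.80), (0.75, 0.60), (1.00, 0.40)]

def monthly_cost(usage_fraction, base_hourly, hours_in_month=730):
    """Cost of running one instance for usage_fraction of the month,
    with later tiers billed at progressively deeper discounts."""
    cost, prev = 0.0, 0.0
    for upper, multiplier in TIERS:
        if usage_fraction <= prev:
            break
        hours = (min(usage_fraction, upper) - prev) * hours_in_month
        cost += hours * base_hourly * multiplier
        prev = upper
    return cost
```

Under these assumptions, an instance priced at $0.10/hour that runs the full month costs 730 x 0.10 x 0.70 = $51.10 instead of $73.00 - an effective 30% discount, with no reserved-capacity commitment required up front.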
A journalist called and asked me today about the market size for wearables. I replied, “That’s not the big story.”
So what is? It's data, and what you can do with it.
First, you have to collect the data and have permission to do so. Most of these relationships are one-to-one. I have these relationships with Nike, Jawbone, Basis, RunKeeper, MyFitnessPal, and a few others. I have an app for each on my phone that harvests the data and shows it to me in a way I can understand. Many of these devices have open APIs, so I can import my Fitbit or Jawbone data into MyFitnessPal, for example.
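The aggregation pattern behind that kind of import is simple enough to sketch. Below, canned JSON stands in for real API responses, and the feed shapes are hypothetical rather than the actual Fitbit or Jawbone payloads: pull daily step counts from each device's feed and merge them into one per-day view.

```python
import json

# Canned JSON standing in for real API responses; the field names
# are hypothetical, not the actual Fitbit or Jawbone schemas.
fitbit_feed = json.loads('[{"date": "2014-05-01", "steps": 8200},'
                         ' {"date": "2014-05-02", "steps": 10400}]')
jawbone_feed = json.loads('[{"date": "2014-05-01", "steps": 7900},'
                          ' {"date": "2014-05-03", "steps": 6100}]')

def merge_feeds(*feeds):
    """Combine per-device daily step counts into one per-day view,
    keeping the highest count when devices overlap on a day."""
    daily = {}
    for feed in feeds:
        for entry in feed:
            day = entry["date"]
            daily[day] = max(daily.get(day, 0), entry["steps"])
    return dict(sorted(daily.items()))
```

Keeping the per-day maximum is one simple de-duplication choice; an aggregator could equally average the devices or prefer a trusted source.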
From the story on 9to5mac.com, it is clear that Apple (as with Passbook) is creating a single place for consumers to store a wide range of healthcare and fitness information. From the screenshots, it also appears that one can trend this information over time. The phone is capable of collecting some of this information, and is increasingly doing so with less battery burn thanks to efficiencies in how the sensor data is crunched, so to speak. Wearables - perhaps one from Apple - will collect more information. Other data will certainly come from third-party wearables - such as fitness wearables, patches, bandages, socks, and shirts - and attachments, such as the Smartphone Physical. There will always be tradeoffs between the amount of information you collect and the form factor. While I don't want to wear a chubby, clunky device 24x7, the devices get better every day.