The Background – Linux as a Fast Follower and the Need for Hot Patching
No doubt about it, Linux has made impressive strides in the last 15 years, gaining many features previously associated with high-end proprietary Unix as it made the transition from small-system plaything to core enterprise processing resource and the engine of the extended web as we know it. Along the way it gained reliable and highly scalable schedulers, a multiplicity of efficient and scalable file systems, advanced RAS (reliability, availability, and serviceability) features, its own embedded virtualization, and efficient thread support.
As Linux grew, so did supporting hardware, particularly the capabilities of the ubiquitous x86 CPU on which the vast majority of Linux runs today. But the debate has always been about how close Linux could get to “the real OS”: the core proprietary Unix variants that for two decades defined the limits of non-mainframe scalability and reliability. Now “the times they are a-changin’,” and the new narrative may be “when will Unix catch up to Linux on critical RAS features like hot patching?”
Hot patching, the ability to apply updates to the OS kernel while it is running, is a long-sought-after but elusive feature of a production OS. Long sought after because both developers and operations teams recognize that bringing down an OS instance that is doing critical high-volume work is at best disruptive and at worst a logistical nightmare, and elusive because it is incredibly difficult to implement. There have been several failed attempts, and several implementations that “almost worked” but were so fraught with exceptions that they were not really useful in production.[i]
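The core idea behind hot patching can be illustrated with a loose userspace analogy: redirect callers of a buggy routine to a fixed replacement while the program keeps running, with no restart. This sketch is purely illustrative (the function names and the discount scenario are hypothetical); real kernel livepatching does the equivalent at the machine-code level, typically by interposing on function entry points.

```python
# Userspace analogy of hot patching: swap a buggy routine for a fixed
# one while the "system" keeps running. Kernel livepatching achieves
# the same effect by redirecting function entry points in a live kernel;
# this sketch only illustrates the idea with a rebinding of a symbol.

def buggy_discount(price):
    # Bug: applies a 50% discount instead of 5%.
    return price * 0.50

def fixed_discount(price):
    return price * 0.95

class Checkout:
    def total(self, price):
        # Callers resolve `discount` at call time, so a rebinding of
        # the name takes effect for all future calls immediately.
        return discount(price)

discount = buggy_discount          # the "kernel function" currently in use

before = Checkout().total(100)     # system is live; bug is active -> 50.0

discount = fixed_discount          # the "hot patch": rebind, no restart

after = Checkout().total(100)      # same running system, fixed -> 95.0
print(before, after)
```

The hard part in a real kernel is everything this analogy skips: finding a consistent moment to switch over when some thread may be executing the old function, and handling patches that change data layouts rather than just code.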
As a customer insights / analytics / digital measurement pro, do you experience any of these challenges? And what can you do right now to make progress with them?
I can’t keep up with requests from my stakeholders for analysis and insights. Does the volume of requests and your team’s capacity seem increasingly out of whack in your organization?
Our customer data isn’t where we need it to be – we can’t get a comprehensive view of our customer. You’re not alone. Marketing and technology teams struggle to align objectives, roles, budgets, projects, processes, and timelines to maximize value from customer data. Marketing decision-makers report several reasons they are failing: too many data sources (44%), lack of access to technology to manage data source integration (38%), lack of budget (35%), lack of skills to support integration (34%), organizational silos (27%), and lack of an executive sponsor (23%).
We’re leaving money on the table because our different analytics and insights teams work in silos. Here’s a simple digital measurement example of this: one digital team is responsible for driving visits to the website. Other teams are responsible for maximizing on-site conversions. They work in their own separate silos. A more efficient and effective approach: work together to identify the characteristics of customers most likely to convert, and work on driving that group to the site. That type of silo breakdown needs to happen more.
Delivering broad access to data and analytics to a diverse base of users is an intimidating task, yet it is an essential foundation to becoming an insights-driven organization. To win and keep customers in an increasingly competitive world, firms need to take advantage of the huge swaths of data available and put that data into the hands of more users. To do this, business intelligence (BI) pros must evolve disjointed and convoluted data and analytics practices into well-orchestrated systems of insight that deliver actionable information. But implementing digital insights is just the first step with these systems, and few hit the bull's eye the first time. Continuously learning from previous insights and their results makes future efforts more efficient and effective. This is a key capability of next-generation BI, what Forrester calls systems of insight.
"It's 10 o'clock! Do you know if your insights support actual verifiable facts?" This is a real challenge, as measuring report and dashboard effectiveness today involves mostly discipline and processes, not technology. For example, if a data mining analysis predicted a certain number of fraudulent transactions, do you have the discipline and processes to go back and verify whether the prediction came true? Or if a metrics dashboard was flashing red, telling you that inventory levels were too low for the current business environment, and the signal caused you to order more widgets, do you verify if this was a good or a bad decision? Did you make or lose money on the extra inventory you ordered? Organizations are still struggling with this ultimate measure of BI effectiveness. Only 8% of Forrester clients report robust capabilities for such continuous improvement, and 39% report just a few basic capabilities.
From discussions with our clients in the financial services industry (FSI) in Asia Pacific, we’ve noticed that their digital agenda has changed dramatically over the past 18 months, shifting from a consideration of acquisitions and distribution channels to a broader business transformation imperative.
In fact, leaders at banks and insurance firms are increasingly realizing that:
Customer experience is fast becoming the only competitive differentiator.
Banks and insurers have to accelerate their ability to innovate and to deliver new sources of value to customers faster.
Yes, I think someone’s banging on the door. Pretty hard actually.
In fact, it’s deafening.
The knocking is empowered digital media buyers. The slowness to answer is the media ecosystem of publishers, media agencies, and broadcasters.
I shared the video below a week ago on LinkedIn and people clearly like it. It’s the parable I just stated, but acted out. Listen to Gabe Leydon of Machine Zone (big digital media buyer) slam the media ecosystem. It’s painful. Cathartic. Iconoclastic. Focus on two segments: 11:00 -> 11:45 and 12:55 -> 13:55.
This is the advertising ecosystem’s reckoning with the age of the customer. The customers want to cut through all of the layers of BS that advertising has traditionally wrapped itself up in.
I had a few takeaways given Leydon’s analysis:
Media businesses are trying to be technology platforms, but are mostly houses on fire.
Analytics agencies are the new media agencies.
Media agencies are just houses on fire.
If you’re a marketer, pull your media-buying capabilities close to your chest. Invest in better analytics. And do everything in your power to get a measurable, direct-to-consumer sales channel on its feet, if only to provide insights to the marketing that feeds your indirect channels.
We've seen another acquisition in the shifting eDiscovery market this week as kCura, the developer of Relativity, announced its acquisition of Content Analyst Company, the brains behind the CAAT analytics engine (kCura’s press release is here). The acquisition is not entirely surprising. kCura has been relying on the CAAT engine to power its analytics offering for eight years. According to kCura, use of its Relativity Analytics offering “has grown by nearly 1,500 percent” since 2011, and more than 70% of current kCura customers hold licenses.
What does this acquisition mean for kCura, its customers, and Content Analyst Company customers?
One of the reasons that only a portion of enterprise and external data (about a third of structured data and a quarter of unstructured data) is available for insights is the restrictive architecture of SQL databases. In SQL databases, data and metadata (data models, aka schemas) are tightly bound and inseparable (aka early binding, or schema-on-write). Changing the model requires, at best, rebuilding an index or an aggregate and, at worst, reloading entire columns and tables. Therefore many analysts start their work from data sets based on these tightly bound models, where DBAs and data architects have already built business requirements (which may be outdated or incomplete) into the models. The data delivered to end users thus already contains inherent biases, which are opaque to the user and can strongly influence their analysis. As part of the natural evolution of Business Intelligence (BI) platforms, data exploration now addresses this challenge. How? BI pros can now take advantage of ALL of the raw data available in their enterprises by:
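The early-binding vs late-binding distinction above can be sketched with the standard library alone. The data is hypothetical; the contrast is real: with schema-on-write, any attribute the modeler didn't anticipate is lost before the analyst ever sees it, while schema-on-read keeps the raw record and lets the analyst bind a model at query time.

```python
# Schema-on-write vs schema-on-read, sketched with stdlib sqlite3/json.
# Hypothetical customer records for illustration only.

import json
import sqlite3

# Schema-on-write (early binding): the model is fixed before loading.
# A field the DBA didn't model -- churn_risk -- has nowhere to go.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER, region TEXT)")
db.execute("INSERT INTO customers VALUES (1, 'APAC')")
rows = db.execute("SELECT id, region FROM customers").fetchall()

# Schema-on-read (late binding): store the raw record, apply a "model"
# only at query time, so no attribute is discarded up front.
raw = ['{"id": 1, "region": "APAC", "churn_risk": 0.82}']
records = [json.loads(r) for r in raw]
high_risk = [r["id"] for r in records if r.get("churn_risk", 0) > 0.5]

print(rows)       # [(1, 'APAC')] -- churn_risk was never captured
print(high_risk)  # [1] -- recoverable because the raw data survived
```

This is also where the bias point bites: the SQL table silently encodes the modeler's decision that `churn_risk` doesn't matter, and the analyst downstream has no way to know that.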
I’ll be brief, because I know you’re busy. If you’re a customer-obsessed marketer, you should plan to attend Forrester’s annual Forum designed just for you – MARKETING 2016. Join us and 600+ marketing leaders in New York City on April 26-27 as we dive deep into the issues that matter most to you. Our agenda this year comprises five sections:
1. Thriving In The Post-Digital Age: Led by our own VP/Principal Analyst Shar VanBoskirk, hear our latest research on what it takes to succeed as a marketing leader in a world where digital embeds in everything.
2. Customer Understanding: Sick of all the noise about big data? Join VP and Research Director Srividya Sridharan as she uncovers how to move from data, to insights, to business action.
3. Contextual Engagement: Principal Analyst Rusty Warner will be joined onstage by eBay and J&J as they discuss best practices in using situational context to drive deeper customer engagement.
4. The Leadership Question: Forrester’s Michelle Moorehead will moderate a superstar panel on the changing leadership role for CMOs.
5. The Power Of Trust: Principal Analyst Fatemeh Khatibloo will discuss how your ability to manage consumer privacy will be a key differentiator in building trust.
When I read articles like today's WSJ article on mutual funds exiting high-tech startups and triangulate that content with Forrester client interactions over the last 12 to 18 months (and some rumors), I am becoming convinced that there will be some Business Intelligence (BI) and analytics vendor shake-ups in 2016. Even though, according to our research, enterprises are still leveraging only 20%-40% of their entire universe of data for insights and decisions, and 50%-80% of all BI/analytics apps are still done in spreadsheets, the market is oversaturated with vendors. Just take a look at the 50+ vendors we track in our BI Vendor Landscape. IMHO, we are nearing a saturation point where the buy side of the market cannot sustain so many sellers. Indeed, we are already seeing a trend where large enterprises, which a couple of years ago had 10+ different BI platforms, today usually deploy only somewhere between three and five. And, in case you missed it, we already saw what is sure to be the start of a much bigger trend of BI/analytics M&A: SAP acquiring mobile BI vendor Roambi. Start hedging your BI vendor bets!
Rule #1: Don't just jump into creating a hefty, enterprise-wide Business Intelligence (BI)
Business intelligence and its next iteration, systems of insight (SOI), have moved to the top of BI pros' agendas for enterprise software adoption. Investment in BI tools and applications can have a number of drivers, both external (such as regulatory requirements or technology obsolescence) and internal (such as the desire to improve processes or speed up decision-making). However, putting together a BI business case is not always a straightforward process. Before embarking on a BI business case endeavor, consider that:
You may not actually need a business case. Determining whether a BI business case is necessary hinges on three main considerations: Is this an investment that the organization must make to stay in business, should consider because other investments are changing the organization's IT landscape, or wants to make because of expected business benefits?
A business sponsor does not obviate the need for a business case. It may be tempting to conclude that you can skip making a business case for BI whenever there is a strong push for investment from the business side, in particular when budget holders are prepared to commit money. Resist this impulse whenever possible: The resulting project will likely suffer from a lack of focus, and recriminations are likely to follow sooner or later.