What Does Acxiom's $310M LiveRamp Bid Mean For Marketers?

Tina Moffett

On May 14, Acxiom announced its intention to acquire LiveRamp, a "data onboarding service," to the tune of $310 million in cash. Several Forrester analysts (Fatemeh Khatibloo, Susan Bidel, Sri Sridharan, and I) cover these two firms, and what follows is our collective thinking on the impending acquisition after having been briefed by Acxiom's leadership on the matter.

Read more

Cloud Is Becoming A Key Feature Of The BI And Analytics Landscape

Martha Bennett

“Business Intelligence in the cloud? You’ve got to be joking!” That’s the response I got when I recently asked a client whether they’d considered availing themselves of a software-as-a-service (SaaS) solution to meet a particular BI need. Well, I wasn’t joking. There are many scenarios when it makes sense to turn to the cloud for a BI solution, and increasing numbers of organizations are indeed doing so. Indications are also that companies are taking a pragmatic approach to cloud BI, headlines to the contrary notwithstanding. Forrester has found that:

  • Less than one-third of organizations have no plans for cloud BI. When we asked respondents in our Forrsights Software Survey Q4 2013 whether they were using SaaS BI in the cloud, or were intending to do so, fewer than one-third declared that they had no plans. Of the rest, 34% were already using cloud BI, and 31% had cloud in their BI plans for the next two years. But it’s not a case of either/or: the majority of those who’ve either already adopted cloud BI or are intending to do so use the SaaS system to complement their existing BI and analytics capabilities. Still, it’s worth noting that 12% of survey respondents had already replaced most or all of their existing BI systems with SaaS, and a further 16% were intending to do so.

Read more

How Will The Data Economy Impact Enterprise Architects?

Gene Leganza

No self-respecting EA professional would enter into planning discussions with business or tech management execs without a solid grasp of the technologies available to the enterprise, right? But what about the data available to the enterprise? Given the shift toward data-driven decision-making and the clear advantages of advanced analytics capabilities, architecture professionals should be coming to the planning table with not only an understanding of enterprise data but also a working knowledge of the available third-party data that could have a significant impact on their approach to customer engagement or their B2B partner strategy.
 
 
Data discussions can't simply be about internal information flow, master data, and business glossaries anymore. Enterprise architects, business architects, and information architects working with business execs on tech-enabled strategies need to bring third-party data know-how to their brainstorming and planning discussions. As the data economy is still in its relatively early stages and, more to the point, as organizational responsibilities for sourcing, managing, and governing third-party data are still in their formative states, it behooves architects to take the lead in understanding the data economy in some detail. By doing so, architects can help their organizations find innovative approaches to data and analytics that have direct business impact: improving the customer experience, making the partner ecosystem more effective, or finding new revenue from data-driven products.
 
Read more

Double Down On Data: Who Ya Gonna Call?

Jennifer Belissent, Ph.D.

An explosion of data is revolutionizing business practices. The availability of new data sources and delivery models provides unprecedented insight into customer and partner behavior and enables a much-improved capacity to understand and optimize business processes and operations. Real-time data allows companies to fine-tune inventories and in-store product placement; it allows restaurants to know what a customer will order, even before they read the menu or reach the counter. And data is also the foundation for new service offerings at companies like John Deere, BMW, and Starwood.

Read more

Tibco Buys Jaspersoft: A Deal With Transformative Potential

Martha Bennett

Since Tibco acquired Jaspersoft on April 28th, 2014, I keep being asked the question: “Will this deal change the BI and analytics landscape?” (If you missed the announcement, here’s the press release.)

The short answer is: it could. The longer answer goes something like this: Jaspersoft and Tibco Spotfire complement each other nicely; Jaspersoft brings ETL and embedded BI to the table, whereas Spotfire has superior data analysis, discovery, and visualization capabilities. Jaspersoft’s open source business model provides Tibco with a different path to market, and Jaspersoft can benefit from Tibco’s corporate relationships and sales infrastructure. And with its utility-based cloud service, Jaspersoft also adds another option to Spotfire’s SaaS BI offering.    

But that’s only the narrow view: once you take into consideration Tibco’s history (the hint’s in the name: “The Information Bus Company”) and its more recent string of acquisitions, a much larger potential story emerges. Starting with Spotfire in 2007, Tibco has assembled a powerful set of capabilities, including (but not limited to) analytics, data management, event processing, and related technologies such as customer loyalty management and mapping. If Tibco manages to leverage all of its assets in a way that provides enterprises with a flexible, agile, integrated platform that helps them turn their data into actionable information, it will be a powerful new force with the potential to change the enterprise BI platform market.

To get there, Tibco has a number of challenges to address. On a tactical basis, it’s all about making the Jaspersoft acquisition work:

  • Retaining the talent
  • Making it easy for clients and prospects to engage with both companies
Read more

IBM Announces Next Generation POWER Systems – Big Win for AIX Users, New Option for Linux

Richard Fichera

On April 23, IBM rolled out the long-awaited POWER8 CPU, the successor to POWER7+, and given the extensive pre-announcement speculation, the hardware itself was no big surprise (the details are fascinating, but not suitable for this venue). It offers an estimated 30% to 50% improvement in application performance over the latest POWER7+, with the potential for order-of-magnitude improvements on selected big data and analytics workloads. While the technology is interesting, we are pretty numb to the “bigger, better, faster” messaging that inevitably accompanies new hardware announcements; the real impact of this announcement lies in its utility for current AIX users and in IBM’s increased focus on Linux and its support of the OpenPOWER initiative.

Technology

OK, so we’re numb, but it’s still interesting. POWER8 is an entirely new processor generation implemented in 22 nm CMOS (the same geometry as Intel’s high-end CPUs). The processor features up to 12 cores, each with up to 8 threads, and a focus not only on throughput but also on high performance per thread and per core for low-thread-count applications. Added to the mix are up to 1 TB of memory per socket, massive PCIe 3 I/O connectivity, and the Coherent Accelerator Processor Interface (CAPI), IBM’s technology to deliver memory-controller-based access for accelerators and flash memory in POWER systems. CAPI figures prominently in IBM’s positioning of POWER as the ultimate analytics engine, with the announcement profiling the performance of a configuration using 40 TB of CAPI-attached flash for huge in-memory analytics at a fraction of the cost of a non-CAPI configuration.
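
For a rough sense of what those figures imply at the socket level, here is a back-of-the-envelope sketch using only the numbers quoted above (illustrative arithmetic, not IBM benchmark data):

```python
# Illustrative arithmetic based on the POWER8 figures quoted above;
# not benchmark data, just the implied per-socket totals.
cores_per_socket = 12        # up to 12 cores per socket
threads_per_core = 8         # up to 8 SMT threads per core
memory_per_socket_gb = 1024  # up to 1 TB of memory per socket

threads_per_socket = cores_per_socket * threads_per_core
memory_per_thread_gb = memory_per_socket_gb / threads_per_socket

print(f"Hardware threads per socket: {threads_per_socket}")          # 96
print(f"Memory per hardware thread: {memory_per_thread_gb:.1f} GB")  # ~10.7 GB
```

In other words, a fully configured socket offers 96 hardware threads with roughly 10.7 GB of memory behind each one, which is the kind of headroom the in-memory analytics positioning leans on.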

A Slam-dunk for AIX users and a new play for Linux

Read more

Apps Are Blind — Use Sensors To Make Them See

Mike Gualtieri

Most apps are dead boring. Sensors can help add some zing. Sensors are data collectors that measure physical properties of the real world, such as location, pressure, humidity, touch, voice, and much more. You can find sensors just about anywhere these days, most obviously in mobile devices that have accelerometers, GPS, microphones, and more. There is also the Internet of Things (IoT), which refers to the proliferation of Internet-connected and accessible sensors expanding into every corner of humanity. But most applications barely use them, let alone to their fullest extent. Data from sensors can help make your apps predictive to impress customers, make workers more efficient, and boost your career as an application developer.
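
To make the idea concrete, here is a minimal, purely hypothetical sketch of an app combining two sensor readings (location and accelerometer) to pick a context-aware action. The data class, coordinates, and thresholds are invented for illustration; a real app would pull these values from the platform's sensor APIs.

```python
# Hypothetical sketch: combining sensor readings to make an app context-aware.
# The reading, coordinates, and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class SensorReading:
    latitude: float
    longitude: float
    acceleration_g: float  # magnitude of the accelerometer reading, in g

STORE_LAT, STORE_LON = 40.7128, -74.0060  # example store coordinates
WALKING_THRESHOLD_G = 1.2                 # rough cutoff for "in motion"

def near_store(reading: SensorReading, tolerance: float = 0.01) -> bool:
    """Very coarse proximity check on raw latitude/longitude deltas."""
    return (abs(reading.latitude - STORE_LAT) < tolerance and
            abs(reading.longitude - STORE_LON) < tolerance)

def suggest_action(reading: SensorReading) -> str:
    """Choose an app behavior from the combined sensor context."""
    if near_store(reading) and reading.acceleration_g > WALKING_THRESHOLD_G:
        return "Customer is walking near the store: surface today's offer."
    if near_store(reading):
        return "Customer is nearby but stationary: queue a gentle reminder."
    return "No useful location context: show the default home screen."

print(suggest_action(SensorReading(40.7127, -74.0059, 1.5)))
```

The point is not the specific rule but that even two cheap sensor signals, combined, let an app anticipate what the user needs instead of waiting to be asked.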

Read more

Future-Proof Your Customer Insights Practice with Adaptive Intelligence

Fatemeh Khatibloo

We've been talking about Adaptive Intelligence (AI) for a while now. As a refresher, AI is the real-time, multidirectional sharing of data to derive contextually appropriate, authoritative knowledge that helps maximize business value.

Increasingly in inquiries, workshops, FLB sessions, and advisories, we hear from our customer insights (CI) clients that developing the capabilities required for adaptive intelligence would actually help them solve a lot of other problems, too. For example:

  • A systematic data innovation approach encourages knowledge sharing throughout the organization, reduces data acquisition redundancies, and brings energy and creativity to the CI practice.
  • A good handle on data origin kickstarts your marketing organization's big data process by providing a well-audited foundation to build upon.
  • Better data governance and data controls improve your privacy and security practices by ensuring cross-functional adoption of the same set of standards and processes.
  • Better data structure puts more data in the hands of analysts and decision-makers, in the moment and within the systems of need (e.g., campaign management tools, content management systems, customer service portals, and more).
  • More data interoperability enables channel-agnostic customer recognition, and the ability to ingest novel forms of data -- like preference data, wearables data, and many more -- that can vastly improve your ability to deliver great customer experiences (a rough sketch follows this list).
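
As a rough illustration of channel-agnostic recognition, the sketch below joins records from different channels on a shared, hashed identifier. The records, field names, and matching rule are invented for this example rather than taken from any particular product.

```python
# Hypothetical sketch of channel-agnostic customer recognition:
# match records from different channels on a shared, hashed identifier.
# All records and field names below are invented for illustration.
import hashlib
from collections import defaultdict

def match_key(email: str) -> str:
    """Normalize and hash an email so channels can share a join key without raw PII."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

channel_records = [
    {"channel": "web",      "email": "Pat@Example.com", "event": "viewed pricing page"},
    {"channel": "email",    "email": "pat@example.com", "event": "clicked offer"},
    {"channel": "wearable", "email": "pat@example.com", "event": "logged a morning run"},
]

profiles = defaultdict(list)
for record in channel_records:
    profiles[match_key(record["email"])].append((record["channel"], record["event"]))

for key, events in profiles.items():
    print(f"Customer {key[:8]}...: {events}")  # one profile spanning web, email, and wearable
```

Real identity resolution is far messier (probabilistic matching, consent, multiple identifiers), but the principle is the same: a common, well-governed key is what lets preference and wearables data land in the same profile as web and email activity.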
Read more

What Do Business Intelligence Consultants Mean By “Solutions”?

Boris Evelson

Management consultants and business intelligence, analytics, and big data systems integrators often use the terms accelerators, blueprints, solutions, frameworks, and products to show off their industry and business domain (sales, marketing, finance, HR, etc.) expertise, experience, and specialization. Unfortunately, they often use these terms synonymously, while in reality the meanings vary quite widely. Here’s our pragmatic take on the tangible reality behind the terms (in increasing order of comprehensiveness):

  • Frameworks. Often little more than a collection of best practices and lessons learned from multiple client engagements. These can sometimes shave 5% to 10% off a project's time and effort, mainly by letting buyers learn from the mistakes others have already made rather than repeating them.
  • Solution Accelerators. Also known as blueprints, these are usually a collection of deliverables, content, and other artifacts from prior client engagements. Such artifacts could be in the form of data connectors, transformation logic, data models, metrics, reports, and dashboards, but they are often little more than existing deliverables that can be cut and pasted or otherwise leveraged in a new client engagement. Similar to Frameworks, Solution Accelerators often come with a set of best practices. They can help you hit the ground running: rather than starting from scratch, you find yourself 10% to 20% into a project.
  • Solutions. A step above Solution Accelerators, Solutions prepackage artifacts from prior client engagements by cleansing them and stripping out proprietary content and/or irrelevant information. Count on shaving 20% to 30% off the effort.
Read more

To Name Your Price Is To Know Your Price: SmartProcure Brings Data Sharing To Public Procurement

Jennifer Belissent, Ph.D.

So you need some work done that you’ve never had done before, or you need to buy something you’ve never bought before. What should you pay? That can be a tough question. What seems reasonable? Sometimes we set arbitrary rules: it’s OK if it’s under $50 or under $100. But that’s just reassurance that you’re not getting ripped off too badly. Certainly the best way to avoid that outcome is to know how much that service or thing is worth, or at least know what others have paid for the same thing.

Fortunately now, in the age of the customer, that’s easier to find out. Price information for most consumer goods is easier to come by, making the buying process more efficient. But what about governments? We’ve all heard about the $600 toilet seat or the $400 hammer. Stories of government spending excess and mismanagement abound. Some are urban legends or misrepresentations. Others have legs — such as the recent reports of Boeing overcharging the US Army. While these incidents are likely not things of the past, open data initiatives have made significant progress in exposing spending data and improving transparency. Citizens can visit sites such as USAspending.gov for US federal government spending or "Where Does My Money Go?" for details on UK national government spending, and most large cities publish spending as well.
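
To make "know what others have paid" concrete, here is a minimal sketch that benchmarks a purchase against a handful of past records. The records and field names are fabricated for illustration; a real analysis would pull them from a source such as USAspending.gov or a service like SmartProcure.

```python
# Minimal sketch of price benchmarking from past purchase records.
# The records below are fabricated for illustration only.
from statistics import median

past_purchases = [
    {"item": "office chair", "unit_price": 185.00},
    {"item": "office chair", "unit_price": 210.00},
    {"item": "office chair", "unit_price": 650.00},  # the outlier you would want to spot
    {"item": "office chair", "unit_price": 199.00},
]

def price_benchmark(records, item):
    """Return the median and range of prices previously paid for an item."""
    prices = [r["unit_price"] for r in records if r["item"] == item]
    return {"median": median(prices), "low": min(prices), "high": max(prices)}

benchmark = price_benchmark(past_purchases, "office chair")
print(f"Typical price paid: ${benchmark['median']:.2f} "
      f"(range ${benchmark['low']:.2f} to ${benchmark['high']:.2f})")
```

Even this crude median-and-range view is enough to flag the $650.00 purchase as worth a second look, which is exactly the leverage that shared procurement data gives public buyers.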

Read more