Razor, Blades, and 10x: Key to Value-Based Pricing in the Data Warehousing Appliance Market

Price-performance is everything in data warehousing (DW), and it’s become the leading battleground for competitive differentiation.

As I noted in a blog post last month, the price of a fully configured DW appliance solution has dropped by an order of magnitude over the past 2-3 years, and it’s likely to continue declining. In 2010, many DW vendors will lower the price of their basic appliance products to less than $20,000 per usable terabyte (TB), which constitutes the new industry threshold pioneered by Oracle, Netezza, and other leading DW vendors.

But that’s just a metric of price, not price-performance. Ideally, each DW appliance vendor should be able to provide you with a metric that tells you exactly how much performance “bang” you’re getting for all those bucks. In a perfect world, all vendors would use the same price-performance metric and you would be able to compare their solutions side by side.
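
To make the arithmetic concrete, here is a minimal sketch, with entirely hypothetical vendor figures, of how you might normalize competing quotes into cost per usable TB and a crude cost-per-throughput ratio. The vendor names, prices, capacities, and query rates below are invented for illustration, not drawn from any real appliance.

```python
# Hypothetical figures only: no real vendor's pricing or performance is implied.
from dataclasses import dataclass

@dataclass
class ApplianceQuote:
    vendor: str
    price_usd: float         # fully configured system price
    usable_tb: float         # usable (not raw) capacity
    queries_per_hour: float  # sustained throughput on your own workload

    @property
    def cost_per_usable_tb(self) -> float:
        return self.price_usd / self.usable_tb

    @property
    def cost_per_query_per_hour(self) -> float:
        # A crude price-performance ratio: dollars per unit of throughput (lower is better).
        return self.price_usd / self.queries_per_hour

quotes = [
    ApplianceQuote("Vendor A", 1_900_000, 100, 4_000),
    ApplianceQuote("Vendor B", 1_400_000, 80, 2_500),
]

for q in quotes:
    print(f"{q.vendor}: ${q.cost_per_usable_tb:,.0f} per usable TB, "
          f"${q.cost_per_query_per_hour:,.0f} per query/hour of throughput")
```

In this made-up comparison, the vendor that looks cheaper per terabyte is actually the more expensive one per unit of throughput, which is exactly why price alone is a misleading yardstick.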

But, as I noted a year ago in another blog post, truly comparable cross-vendor DW benchmarks have never existed and are unlikely to emerge in today’s intensely competitive arena. No two DW vendors provide performance numbers that are based on the same configurations, workloads, and benchmark metrics. And considering how sensitive these performance claims are to so many variables in the vendors’ solutions and in customers’ production environments, it can be quite difficult to verify every vendor performance claim in your specific environment.

Read more

Self-Service Business Intelligence: Dissolving the Barriers to Creative Decision-Support Solutions

Self-service is all the rage in the world of business intelligence (BI), but it’s no fad. In fact, it’s the only way to make BI more pervasive, delivering insights into every decision—important or mundane—that drives your business. It’s the key to empowering users with actionable insights while removing many routine BI development and maintenance tasks from IT’s crushing workload.

In mid-2009, I published a Forrester report describing key benefits, use cases, and approaches for implementing self-service BI, under the broad heading of “mighty mashup.” Forrester customers have responded very favorably to the discussion, asking for advice on whether, when, and how they should adopt this approach. Going forward, Forrester will deepen its discussion of self-service as a best practice to be incorporated into enterprise BI Solution Center (BISC) teachings.

Read more

Process Mining: Because Your Company’s Workflow Issues Aren’t Always Obvious

Business processes can be incredibly hard to fathom. The more complex they are, the more difficult it is to find the magic blend of tasks, roles, flows, and other factors that distinguish a well-tuned process from a miserable flop. Even the people who’ve been part of the process for years may have little clue. It’s not just that they refuse to look beyond their job-specific perspectives, for fear of jeopardizing their careers. It’s often an issue of them being too close to the problem to see it clearly, even if they try very hard.

Process analytics is all about identifying what works and what doesn’t. It’s a key focus for us here at Forrester, and I’m collaborating with one of our leading business process management (BPM) analysts, Clay Richardson, on research into this important topic. The first order of business for us is to identify the full range of enabling infrastructure and tools for tracking, exploring, and analyzing a wide range of workflows. It’s clear that this must include, at the very least, business activity monitoring (BAM) tools, which roll up key process metrics into visual business intelligence (BI)-style dashboards for operational process managers. Likewise, historical process metrics should be available to the business analysts who design and optimize workflows. And each user should have access to whatever current key performance indicators are relevant to the roles they perform within one or more processes.
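
To give a feel for the kind of metric roll-up involved, here is a minimal sketch that derives waiting times between workflow steps from a toy event log and ranks the hand-offs by average delay. The log layout, column names, and sample events are assumptions for illustration, not any BAM product’s actual schema.

```python
# Illustrative only: a toy workflow event log, not any vendor's BAM schema.
import pandas as pd

events = pd.DataFrame([
    # case_id,  activity,         timestamp
    ("order-1", "Receive order",  "2010-01-04 09:00"),
    ("order-1", "Credit check",   "2010-01-04 09:40"),
    ("order-1", "Ship goods",     "2010-01-05 16:10"),
    ("order-2", "Receive order",  "2010-01-04 10:15"),
    ("order-2", "Credit check",   "2010-01-06 08:30"),
    ("order-2", "Ship goods",     "2010-01-06 17:45"),
], columns=["case_id", "activity", "timestamp"])
events["timestamp"] = pd.to_datetime(events["timestamp"])

# Time spent waiting between consecutive activities within each case.
events = events.sort_values(["case_id", "timestamp"])
events["wait"] = events.groupby("case_id")["timestamp"].diff()

# Roll up: which hand-off is the bottleneck, on average?
bottlenecks = (events.dropna(subset=["wait"])
                     .groupby("activity")["wait"].mean()
                     .sort_values(ascending=False))
print(bottlenecks)
```

Even this toy roll-up surfaces something the people inside the process may not see: which step customers actually wait on, averaged across every case rather than filtered through any one participant’s perspective.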

Read more

Social Network Analysis: The Fuse Igniting Enterprise Data Warehouse Growth. It’s Planet Petabyte or Bust!

Social networks have always been with us, of course, but now they’ve gained concrete reality in the online fabric of modern life.

Social network analysis has, in a real sense, been with us almost as long as we’ve been doing predictive analytics. Customer churn analysis is the killer app for predictive analytics, and it is inherently social. It’s long been known that individual customers don’t always make the churn decision—i.e., whether to renew or to bolt to the competition—in isolation. As they run the continual calculus called loyalty in their heads and hearts, they’re receiving fresh feeds of opinion from their friends and families, following the leads of peers and influencers, and keeping their fingers to the cultural breeze. You could also make a strong case for social networking—i.e., individual behaviors spurred, shaped, and encouraged within communities—as a key independent variable driving cross-sell, up-sell, fraud, and other phenomena for which we’ve long built predictive models.
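
To make the “independent variable” point concrete, here is a small sketch, on fabricated data, of turning a social graph into a single churn-model input: the share of each customer’s contacts who have already churned. The table layouts and names are illustrative assumptions, not a production feature pipeline.

```python
# Fabricated example data: a tiny social graph plus churn labels.
import pandas as pd

churned = pd.DataFrame({
    "customer": ["alice", "bob", "carol", "dave", "erin"],
    "churned":  [0,       1,     1,       0,      0],
})

# Undirected "knows" edges between customers.
edges = pd.DataFrame({
    "customer": ["alice", "alice", "bob",  "carol", "dave"],
    "contact":  ["bob",   "carol", "dave", "erin",  "erin"],
})

# Make the edges symmetric so each relationship counts for both endpoints.
edges = pd.concat([edges, edges.rename(columns={"customer": "contact",
                                                "contact": "customer"})])

# Social feature: fraction of each customer's contacts who have churned.
feature = (edges.merge(churned, left_on="contact", right_on="customer",
                       suffixes=("", "_contact"))
                .groupby("customer")["churned"].mean()
                .rename("peer_churn_rate"))

print(feature)
# peer_churn_rate would then sit alongside usage, billing, and demographic
# variables in whatever churn model you already maintain.
```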

The other day, a Forrester client asked me for educated guesses on how fast the average enterprise data warehouse (EDW) is likely to grow over the next several years. As I worked through the analysis, I couldn’t avoid the conclusion that social network analysis—for predictive and other uses—will be an important growth driver (though not the entire story). I’d like to lay out my key points.

Read more

Conversation as a Complex Event

Ah, memories. I remember the late, great Eighties, early in my analyst career, when I had my first brush with what was later known as “groupware.” It was a LAN-based package, “The Coordinator,” from Action Technologies. The architecture of the software wasn’t as important as the linguistic theory on which it was built: the notion that groups cultivate intelligence by structuring their internal conversations to achieve common goals.

Essentially, the package required people to tag every e-mail they sent based on whether it constituted a discussion of possibilities, a request for clarification, or a request for action—and it tracked these threads so that everybody knew the goal-oriented status of every conversation. As you can probably guess, this was a heavy-handed way of getting people to come to agreement. Software shouldn’t dictate how people choose to interact: real-world conversation is far too complex and convoluted for that. Most people don’t like being forced to rephrase or reconceptualize how they communicate with others. In fact, most of us simply defaulted to sending messages that discussed open-ended possibilities rather than engaging in a fussy protocol of formal requests and offers.
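
Purely to illustrate the flavor of that speech-act model, here is a small sketch of how such tagging and thread tracking might be represented. The type names and status logic are my own reconstruction of the general idea, not The Coordinator’s actual design.

```python
# A reconstruction of the general idea, for illustration; not Action Technologies' actual design.
from dataclasses import dataclass, field
from enum import Enum, auto

class SpeechAct(Enum):
    POSSIBILITY = auto()    # open-ended discussion of options
    CLARIFICATION = auto()  # request for clarification
    REQUEST = auto()        # request for action

class ThreadStatus(Enum):
    OPEN = auto()
    AWAITING_REPLY = auto()

@dataclass
class Message:
    sender: str
    act: SpeechAct
    text: str

@dataclass
class ConversationThread:
    goal: str
    messages: list = field(default_factory=list)

    def post(self, msg: Message) -> None:
        self.messages.append(msg)

    @property
    def status(self) -> ThreadStatus:
        # The package's premise: thread state is derived from tagged speech acts,
        # so everybody can see the goal-oriented status of every conversation.
        if self.messages and self.messages[-1].act in (SpeechAct.REQUEST,
                                                       SpeechAct.CLARIFICATION):
            return ThreadStatus.AWAITING_REPLY
        return ThreadStatus.OPEN

thread = ConversationThread(goal="Finalize the Q2 budget")
thread.post(Message("pat", SpeechAct.POSSIBILITY, "We could trim travel by 10%."))
thread.post(Message("lee", SpeechAct.REQUEST, "Please send revised numbers by Friday."))
print(thread.status)  # ThreadStatus.AWAITING_REPLY
```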

Read more

The New Decade of Advanced Analytics: Roll Over Rocket Scientists!

Crafting a truly comprehensive analytics environment is a bit like staring deeply into the night sky. When you try to absorb the billions of celestial objects out there—all of them at different ages and stages in their respective life cycles—you risk driving yourself insane. Your complex field of view contains the deep past, present, and future in one glorious, glowing glimpse.

Increasingly, the complex event processing (CEP) market, as a segment of the analytics arena, suffers from this “all too much” problem. This is not a slap against the technology itself, which is mature, or the growing list of CEP vendors, who offer many sophisticated solutions. Indeed, many CEP vendors now offer tools for viewing the deep present, consisting of myriad streams of real-time events, and the deep past, in the form of access to historical information pulled from many data warehouses, marts, and other repositories. And some—most notably IBM, with its InfoSphere Streams technology—now support visualization of the deep future by applying predictive models to real-time event streams.
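
To show in the simplest terms what applying a predictive model to a real-time event stream looks like, here is a minimal sketch: a scoring function stands in for a model trained on historical data, and it is applied to a sliding window of incoming events. The window size, threshold, and scoring rule are invented for illustration and have nothing to do with InfoSphere Streams’ actual programming model.

```python
# Illustration only: not InfoSphere Streams or any vendor's actual API.
from collections import deque
from typing import Callable, Iterable, List

def score_stream(events: Iterable[float],
                 model: Callable[[List[float]], float],
                 window_size: int = 5,
                 alert_threshold: float = 0.8):
    """Slide a fixed-size window over an event stream and score each window."""
    window = deque(maxlen=window_size)
    for value in events:
        window.append(value)
        if len(window) == window_size:
            risk = model(list(window))   # the "deep future": predicted risk right now
            if risk >= alert_threshold:
                yield value, risk        # emit an alert event downstream

# Stand-in for a model trained offline on historical ("deep past") data:
# here, simply the fraction of recent readings above a learned cutoff.
def toy_model(window: List[float], cutoff: float = 90.0) -> float:
    return sum(v > cutoff for v in window) / len(window)

readings = [72, 88, 91, 95, 97, 99, 84, 70]  # e.g., sensor or transaction metrics
for value, risk in score_stream(readings, toy_model):
    print(f"alert: latest event={value}, predicted risk={risk:.2f}")
```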

The core problem with today’s CEP offerings is that many of them are power tools, not solutions suitable for the mass business market. This same problem confronts established vendors of predictive analytics and data mining (PA/DM) tools, whose core user base consists of statisticians, mathematicians, and other highly educated analytics professionals. No one denies that traditional CEP and PA/DM tools are the analytical bedrock of mission-critical applications in diverse industries. But I challenge you to point to a single case study where they are used directly by the CEO, senior executives, or any other casual user, rather than indirectly through being embedded in some custom application.

Read more

Whither Data Warehousing In The Teens?

People often use the end of a decade to say goodbye to trends that have played themselves out, or good riddance to things that have long since passed their cultural expiration dates. I like to use the beginnings of decades for that same purpose. What, we should ask ourselves, is not likely to last beyond the close of this new ten-year cycle?

In data warehousing, the most likely casualty of the Teens will be the very notion of a data warehouse. You can tell that a concept is on its last legs when its proponents spend more time on defense, fighting definitional trench wars, than evolving it in useful new directions. Here’s a perfect case in point: a recent article by Bill Inmon, self-described “father of the data warehouse,” in which he takes pains to specify what is not a data warehouse. Apparently, many of the approaches that we normally implement in our data warehousing architectures—such as subject-specific data marts, dimensional data structures, federated architectures, and real-time data integration—don’t pass muster in Inmon’s way of looking at things. Though he didn’t mention hybrid row-columnar and in-memory databases by name, one suspects that Inmon has a similarly jaundiced view of these leading-edge data warehousing technologies.

Read more

Podcast: Service Oriented Analytics - Tapping Into The Predictive Smarts Of Your Entire Organization

Our latest featured podcast is Jim Kobielus' "Service Oriented Analytics - Tapping Into The Predictive Smarts Of Your Entire Organization".

In this podcast, BP&A Senior Analyst Jim Kobielus defines service-oriented analytics (SOA), talks about best practices around its implementation, and discusses how BP&A pros can achieve a greater SOA focus.

We look forward to your questions and comments.

---

Subscribe to Business Process & Applications podcasts through iTunes.

Subscribe through RSS.

Advanced Analytics Predictions For 2010

By James Kobielus

As we bid adieu to one decade and move into the next, it’s important to catch our collective breath and to take a quick look ahead. Here are some quick thoughts on the trends that will shape advanced analytics in the year to come. These trends will set the stage for thoroughgoing transformation of business intelligence (BI), data warehousing (DW), predictive analytics (PA), data mining (DM), business activity monitoring (BAM), complex event processing (CEP), and other key analytics technologies in the Teens:

  • Self-service operational BI puts information workers in the driver’s seat: Enterprises have begun to adopt self-service BI to cut costs, unclog the analytics development backlog, and improve the velocity of practical insights. Users are demanding tools to do interactive, deeply dimensional exploration of information pulled from enterprise data warehouses, data marts, transactional applications, and other systems.
Read more

Podcast: Instrumenting Your Enterprise For Maximum Predictive Power

Our latest featured podcast is Jim Kobielus' "Instrumenting Your Enterprise For Maximum Predictive Power".

In this podcast, BP&A Senior Analyst Jim Kobielus discusses how best to leverage your company’s predictive investments. He also lays out a high-level framework for assessing your predictive analytics maturity.

We look forward to your questions and comments.

---

Read more