On January 4, 2010, Oracle announced its acquisition of Silver Creek Systems, a small private software company focused on product data quality, which Oracle plans to add to its Oracle Data Integration offering. In our recent research, “It’s Time To Revisit Product Information Management”, we discussed how Forrester believes Silver Creek holds a virtual monopoly in delivering advanced product data quality capabilities, unmatched by the customer data-centric data quality vendors in the market. As a result, many MDM, PIM, and data quality software vendors, including Oracle, relied on Silver Creek as a strategic partner to add credibility in product data quality. And as we predicted in that research, Silver Creek has now been acquired, which will pose a significant challenge to these partners.
Crafting a truly comprehensive analytics environment is a bit like staring deeply into the night sky. When you try to absorb the billions of celestial objects out there, all of them at different ages and stages in their respective life cycles, you risk driving yourself insane. Your complex field of view contains the deep past, present, and future in one glorious, glowing glimpse.
Increasingly, the complex event processing (CEP) market, as a segment of the analytics arena, suffers from this “all too much” problem. This is not a slap against the technology itself, which is mature, or the growing list of CEP vendors, who offer many sophisticated solutions. Indeed, many CEP vendors now offer tools for viewing the deep present, consisting of myriad streams of real-time events, and the deep past, in the form of access to historical information pulled from many data warehouses, marts, and other repositories. And some, most notably IBM with its InfoSphere Streams technology, now support visualization of the deep future by applying predictive models to real-time event streams.
The core problem with today’s CEP offerings is that many of them are power tools, not solutions suitable for the mass business market. This same problem confronts established vendors of predictive analytics and data mining (PA/DM) tools, whose core user base consists of statisticians, mathematicians, and other highly educated analytics professionals. No one denies that traditional CEP and PA/DM tools are the analytical bedrock of mission-critical applications in diverse industries. But I challenge you to point to a single case study where they are used directly by the CEO, senior executives, or any other casual user, rather than indirectly through being embedded in some custom application.
Most people know Intel as a provider of microprocessors for large manufacturers like Apple, Dell, and HP. A large portion of its business is driven by an elaborate network of customers, resellers, and partners from around the globe. To remain innovative, Intel must enroll, engage, and entertain the most brilliant minds to continue to push the boundaries of technology. Intel realized that social media was the best way to collect and harness the power of that collective wisdom.
The Old Way Of Doing Business. In the past, Intel had used traditional focus groups, where engineers, scientists, and business people would gather from around the globe and spend a week or so together, creating new possibilities. What Intel found was that, in addition to the expense of flying people in from all over the globe, the conversations, however great in person, were difficult to sustain afterward at the same level of creativity and intensity. Once back at home, everyday work and home life caught up with everyone. Intel clearly saw the need, and the desire of this collective brain trust, for more continuous conversation.
Intel decided to use social media as a platform to examine key business factors and sustain these conversations on a continuing basis. Intel engaged key customers in open discussion about how to improve their customer experience. First, it found that customers needed a way to contact Intel quickly and securely to discuss product or process issues and ideas. Second, Intel found that customers wanted the ability to engage with each other without involvement from Intel. Of course, privacy and security were of the utmost importance!
Intel began by evaluating the customer experience, its methods of communicating with customers, and its ability to target and engage customers with effective, relevant, just-in-time communications.
People often use the end of a decade to say goodbye to trends that have played themselves out, or good riddance to things that have long since passed their cultural expiration dates. I like to use the beginnings of decades for that same purpose. What, we should ask ourselves, is not likely to last beyond the close of this new ten-year cycle?
In data warehousing, the most likely casualty of the Teens will be the very notion of a data warehouse. You can tell that a concept is on its last legs when its proponents spend more time on defense, fighting definitional trench wars, than on evolving it in useful new directions. Here’s a perfect case in point: a recent article by Bill Inmon, self-described “father of the data warehouse,” in which he takes pains to specify what is not a data warehouse. Apparently, many of the approaches that we normally implement in our data warehousing architectures, such as subject-specific data marts, dimensional data structures, federated architectures, and real-time data integration, don’t pass muster in Inmon’s way of looking at things. Though he didn’t mention hybrid row-columnar and in-memory databases by name, one suspects that Inmon has a similarly jaundiced view of these leading-edge data warehousing technologies.