Ah, memories. I remember the late, great Eighties, early in my analyst career, when I had my first brush with what was later known as “groupware.” It was a LAN-based package, “The Coordinator,” from Action Technologies. The architecture of the software wasn’t as important as the linguistic theory on which it was built: the notion that groups cultivate intelligence by structuring their internal conversations to achieve common goals.
Essentially, the package required people to tag every e-mail they sent based on whether it constituted a discussion of possibilities, a request for clarification, or a request for action—and it tracked these threads so that everybody knew the goal-oriented status of every conversation. As you can probably guess, this was a heavy-handed way of getting people to come to agreement. Software shouldn’t dictate how people choose to interact: real-world conversation’s far too complex and convoluted for that. Most people don’t like being forced to rephrase or reconceptualize how they communicate with others. In fact, most of us users simply defaulted to sending messages that discussed open-ended possibilities, rather than engaging in a fussy protocol of formal requests and offers.
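The tagging-and-tracking scheme described above can be sketched roughly in code. Everything here—the class names, the three-way taxonomy as an enum, the idea that a thread's "status" is simply its latest speech act—is an illustrative assumption on my part, not The Coordinator's actual design:

```python
from dataclasses import dataclass, field
from enum import Enum, auto

# Hypothetical reconstruction of the speech-act tags described above.
class SpeechAct(Enum):
    POSSIBILITY = auto()    # open-ended discussion of possibilities
    CLARIFICATION = auto()  # request for clarification
    ACTION = auto()         # request for action

@dataclass
class Message:
    sender: str
    text: str
    act: SpeechAct  # every message must carry a tag

@dataclass
class Thread:
    goal: str
    messages: list = field(default_factory=list)

    def post(self, msg: Message) -> None:
        self.messages.append(msg)

    def status(self) -> str:
        """Goal-oriented status of the conversation: the latest speech act."""
        if not self.messages:
            return "open"
        return self.messages[-1].act.name.lower()

thread = Thread(goal="Ship the Q3 report")
thread.post(Message("ana", "Could we move the deadline?", SpeechAct.POSSIBILITY))
thread.post(Message("ben", "Please send me the draft by Friday.", SpeechAct.ACTION))
print(thread.status())  # -> action
```

The sketch also makes the post's complaint concrete: the `act` field is mandatory, so a sender cannot simply write a message—the software forces a classification on every utterance, which is exactly what users routed around by tagging everything as a possibility.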
On January 4, 2010, Oracle announced its acquisition of Silver Creek Systems, a small private software company focused on product data quality, which Oracle plans to add to its Oracle Data Integration offering. In our recent research, “It’s Time To Revisit Product Information Management,” we discussed how Forrester believes Silver Creek holds a virtual monopoly in delivering advanced product data quality capabilities, unmatched by the customer-data-centric data quality vendors in the market. As a result, many MDM, PIM, and data quality software vendors, including Oracle, had relied on Silver Creek as a strategic partner to add credibility in product data quality. And as we accurately predicted in that research, Silver Creek has now been acquired, and its acquisition will pose a significant challenge to those partners.
Were your hopes for growing adoption of green IT dashed by the non-agreement at COP 15 in Copenhagen? Are you dismayed by the weak prospects for cap-and-trade legislation in the US during 2010? Forrester's latest Green IT survey results give us some reason for optimism -- it turns out that regulatory compliance is a weak motivation for companies' pursuit of more sustainable computing operations.
When we asked IT practitioners at 600 enterprises around the world about their top motivations for pursuing green IT operations, regulatory compliance was the seventh-most frequently cited reason, named by just 16% of respondents. What's at the top? Cost and cost. Reducing energy expenses (66%) and reducing other IT operating expenses (42%) have been the strongest drivers for green IT since we began our survey work on this topic in 2007. See the full survey results in our latest Green IT Market Overview report, here.
So fear not: even in the absence of significant regulatory or policy moves this year, good old-fashioned business motivators like profitability and customer demand will continue to push companies to adopt more sustainable processes and practices -- in their IT organizations and beyond.
Software vendors like to claim that their sales proposals are highly confidential: For Your Eyes Only or even, if you prefer the Coen brothers to Bond, Burn After Reading. I help dozens of clients every year with software negotiations, but I can't do that unless they share the vendor's proposal with me, including price details and contract terms. Many clients are reluctant to do so, worried that it might breach confidentiality clauses in their agreements.
A hot topic of debate among customer management and business process thought leaders right now is ascertaining the business value of "social CRM." Social technologies are proliferating rapidly and three-quarters of US online adults now use social technologies in some form. Cutting through all the hype, my clients are challenged to make hard decisions about the level of investment they should make in Social Computing technologies like blogs, wikis, forums, customer feedback tools, social networking sites, and customer community platforms. And they want to know how these new capabilities should be, and can be, integrated with their transactional CRM systems.
We have just published a summary of our research, which defines seven steps to success for strategizing, selecting, and deploying social CRM solutions:
Initiate social CRM experiments immediately. Define a near-term opportunity to apply social CRM ideas to a customer-facing challenge at your company. Build some practical experience that will break you out of old mindsets. Refine your strategies later as new insights emerge. For example, 10 years ago, Electronic Arts recognized that it could not cope with the anticipated tenfold increase in customer support inquiries that would result from launching large-scale online multiplayer games. No commercial solutions were available to help at the time, so Electronic Arts began experimenting and developing its own. Trying new ideas and discarding old ones, EA worked to gain hands-on experience by actively participating in the virtual worlds of its social game players.
Crafting a truly comprehensive analytics environment is a bit like staring deeply into the night sky. When you try to absorb the billions of celestial objects out there—all of them at different ages and stages in their respective life cycles—you risk driving yourself insane. Your complex field of view contains the deep past, present, and future in one glorious, glowing glimpse.
Increasingly, the complex event processing (CEP) market, as a segment of the analytics arena, suffers from this “all too much” problem. This is not a slap against the technology itself, which is mature, or the growing list of CEP vendors, who offer many sophisticated solutions. Indeed, many CEP vendors now offer tools for viewing the deep present, consisting of myriad streams of real-time events, and the deep past, in the form of access to historical information pulled from many data warehouses, marts, and other repositories. And some—most notably IBM, with its InfoSphere Streams technology—now support visualization of the deep future, through the ability to apply predictive models to real-time event streams.
The core problem with today’s CEP offerings is that many of them are power tools, not solutions suitable for the mass business market. This same problem confronts established vendors of predictive analytics and data mining (PA/DM) tools, whose core user base consists of statisticians, mathematicians, and other highly educated analytics professionals. No one denies that traditional CEP and PA/DM tools are the analytical bedrock of mission-critical applications in diverse industries. But I challenge you to point to a single case study where they are used directly by the CEO, senior executives, or any other casual user, rather than indirectly through being embedded in some custom application.
To the surprise of no one who pays even cursory attention to mobile phones, today Google announced the Nexus One phone and their new Google phone store. In case you were hiding out, here are the event highlights:
In this podcast, Clay Richardson walks through five key challenges that process professionals need to address to be successful with business process management in 2010. Topics include lean principles and lean thinking, effectively connecting process initiatives with value drivers, the importance of data, and process-based management.
Most people know Intel as a provider of microprocessors for large manufacturers like Apple, Dell, and HP. A large portion of its business is driven by an elaborate network of customers: resellers, partners, and others from around the globe. To remain innovative, Intel must enroll, engage, and entertain the most brilliant minds to continue pushing the boundaries of technology. The company realized that the ability to collect and harness the power of that collective wisdom would be best served by social media.
The Old Way Of Doing Business. In the past, Intel had used traditional focus groups, where engineers, scientists, and business people would gather from around the globe and spend a week or so together, creating new possibilities. What the company found was that, beyond the expense of flying people in from all over the globe, the conversations, while great, were difficult to sustain at the same level of creativity and intensity. Once participants were back home, everyday work and home life caught up with everyone. And Intel clearly saw the need and desire for the collective wisdom to be engaged in more continuous conversation.
Intel decided to use social media as a platform to examine key business factors and sustain these conversations on a continuing basis. Intel engaged key customers in open discussion about how to improve their customer experience. First, it found that customers needed a way to contact Intel quickly and securely to discuss product or process issues or ideas. Second, Intel found that customers wanted the ability to engage with each other without involvement from Intel. Of course, privacy and security were of the utmost importance!
Intel began by evaluating its customer experience, its methods of communicating with customers, and its ability to target and engage customers with effective, relevant, just-in-time communications.