A couple of months ago I was blogging from sunny Barcelona with the Red Sox 0-6. Now I'm in Barcelona again for our IT Forum, but this month it's raining heavily here, while back in the UK we officially have a drought. But the good news is that Boston is 6-0, at least in Yankee Stadium. A lot can change in two months.
The same is true in IT. Just now, Microsoft faces threats to its strong market position from many directions, and Steve Ballmer is under pressure, but strong results for its June fourth quarter could deflect the flak. That's one reason why sales teams will have greater incentives than ever to close Enterprise Agreement deals in the next couple of weeks. Hopefully, if you're negotiating an EA right now, whether a new deal or a renewal, you've read my report "Consider These Five Criteria When Choosing A Microsoft Volume Licensing Program" and maybe even had an inquiry call with my colleague Christopher Voce or me. One common question we get is whether the stated deadline to accept an offer is real, or whether the same deals will be available in the last days of the quarter or even in the subsequent months. The short answers are "yes, it is" and "no, they won't." Microsoft has its own deal approval processes that take time to complete, and though it won't want to reject purchase orders, it may have problems processing them if they arrive too late. And the deals available almost certainly won't be as good next quarter, because sales teams will still have nine months remaining in which to recoup any shortfall.
What are the right metrics to track the success of a CRM initiative? I just updated my report on this topic for 2011. The report illustrates over 70 different metrics and describes how to link them to business strategies and tactics.
What’s new in the report? My clients are incorporating new measures into their portfolio. In addition to traditional operational metrics, they are adding externally focused customer perception metrics. In particular, I see a rise in adoption of voice of the customer (VoC) metrics and “social metrics”:
We’ve all been through this many times before. So when will it be Hadoop’s turn?
The same thing happens every time. Some shiny new thing gets built up until it’s too big for its britches and then we delight in shooting it down. Or taking it down a few notches until, chastened, it accepts its less-than-lofty position in the divine order of all things IT.
Hadoop is no fad, but it is definitely getting set up for a sober reappraisal — possibly by this time next year, or as soon as a significant number of major EDW vendors roll out their Hadoop products and strategies. I’ve already painted in broad brushstrokes the milestones that Hadoop needs to pass to be considered truly ready for enterprise prime time. I’m reasonably confident that it will meet those challenges over the next two to three years. I’m even willing to meet the open-source absolutists halfway on their faith that the Apache community will be guided by some invisible hand toward a single market-making distro with universal interoperability, peace, love, and understanding.
But even if Hadoop stays on track toward maturation, we’re likely to see the inevitable backlash emerge, spurred by the widespread impatience that usually follows overweening hype. The snarkfest will come as analytics pros start to realize that, promising as this new approach may be, there are plenty of non-Hadoop EDWs that can address the core petabyte-scale use cases I laid out. Many IT practitioners will ask why they should pay good money for a new way of doing things, with all the concomitant disruptions and glitches, when they can simply repurpose their investments in platforms like Teradata, Oracle, IBM, and Microsoft.
I was recently chatting with Jim Harris, the well-respected blogger-in-chief of the Obsessive-Compulsive Data Quality blog, about one of our favorite topics: data governance best practices. Our conversation migrated to one of data governance’s biggest challenges: how to balance bureaucracy and business agility.
So Jim and I thought it would be fun to tackle this dilemma in a Star Wars-themed debate across each of our individual blog platforms, with Jim taking the position for “Agility” as the Rebellion and me taking the opposing position for “Bureaucracy” as the Empire.
Note: Yes, most conversations between self-proclaimed data geeks tend to result in Star Wars or Star Trek parallels . . . and I lost the coin toss. Thankfully, I found StarWars.com to help me with some of my rusty Star Wars facts!
Disclaimer: Remember, this is meant to be a true debate format, where Jim and I are intentionally arguing polar opposite positions with full knowledge of the reality that data governance success requires effectively balancing bureaucracy and agility.
Please take the time to read both of our blog posts; then we encourage your comments — and your votes (see the poll below).
The Canadian market for purchases of information and communications technologies (ICT) by businesses and governments is about 10% the size of the US ICT market, and only about 3% of the global ICT market. Still, it is an important market because of the sophisticated level of its tech adoption (i.e., its readiness to adopt advanced technologies) and its proximity to the US market.
Canada's ICT market will grow by 6.2% in 2011 and 8.1% in 2012 in Canadian dollars — rates very similar to US ICT market growth in US dollars over the same periods. With the Canadian dollar having gained strength against the US dollar, US vendors will see even stronger Canadian revenue growth when they convert their Canadian sales back into US dollars.
Communications equipment and software will have the strongest growth in 2011, at 10.5% and 8.4%, respectively. Computer equipment (4.4% growth) and telecommunications services (2.2% growth) will be the weakest product categories.
Many IT end-user companies deployed hard tokens at a time when intermediate-risk choices were thinner on the ground, and some of these companies would have benefited from a more granular approach anyway. In general, we see companies moving toward risk-based authentication augmented by mobile soft tokens (sometimes called from a mobile application through an API). These software-only solutions are easier and cheaper to deploy, particularly if the target population is on smartphones, and a lot easier to patch in case of an attack. Interestingly, clients now ask about risk-based authentication not only in the B2C context (the norm about a year ago) but also in the B2E context. Right now, end-user companies are thinking about:
How they can ditch hardware tokens altogether; and
How they can move risk-based authentication, and increasingly authorization (fraud management), into the cloud.
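To make the soft-token option concrete: most mobile soft tokens implement the standard TOTP algorithm (RFC 6238), which needs nothing more than a shared secret and the device clock — no dedicated hardware at all. Here is a minimal Python sketch of the algorithm, illustrative only; a real deployment would use a vetted library and securely provisioned secrets:

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, unix_time: int, step: int = 30, digits: int = 6) -> str:
    """Compute a time-based one-time password per RFC 6238 (HMAC-SHA1)."""
    counter = unix_time // step                       # 30-second time window
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", T = 59s, 8 digits
print(totp(b"12345678901234567890", 59, digits=8))  # → 94287082
```

Because the server and the app derive the same code independently, patching the algorithm or rotating secrets is a software update rather than a token recall — which is precisely why these solutions are cheaper to operate than hardware tokens.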
Government investment in ICT is growing. At Forrester, we expect the overall government ICT budget to reach $346 billion in 2011 and to grow by about 10% to $382 billion in 2012. That makes government one of the largest vertical industries — almost double the size of the retail industry, well above the telecom industry, and behind only professional services and financial services. As government soul-searching intensifies in the wake of the financial crisis, and in light of global competition and economic recovery, we expect the dawn of a new government — not "big government" but a government that operates more effectively and certainly more efficiently. What does that look like, and what does that entail? We see three primary trends:
A move to greater performance management, with an emphasis on KPIs for specific programs rather than just budget targets;
An increased dependence on technology, but with an eye to rationalization and consolidation, and an increased role for a centralized CIO to coordinate technology adoption; and
A growing adoption of enterprise management tools that provide visibility not only into department-level programs but also into executive dashboards that enable a holistic view of the government.
As a geographic unit, the market for business and government purchases of information and communications technologies (ICT) in Western and Central Europe will grow by 3.8% in 2011 (measured in euros), compared with 6.4% growth in the US (measured in US dollars). Excluding slow-growing telecommunications services, the information technology (IT) market in Western and Central Europe will grow by 4.5% in euros vs. the 7.4% growth in US dollars in the US (see June 7, “European Information And Communications Technology Market 2011 To 2012 -- The North-South Divide Persists, With Wide Variations In Country Information And Communications Technology Growth”).
What’s clear is that Hadoop has already proven itself in its initial enterprise data warehousing (EDW) footprint: as a petabyte-scalable staging cloud for unstructured content and embedded execution of advanced analytics. As noted in a recent blog post, this is in fact the dominant use case for which Hadoop has been deployed in production environments.
Yes, traditional (Hadoop-less) EDWs can in fact address this specific use case reasonably well — from an architectural standpoint. But given that the most cutting-edge cloud analytics is happening in Hadoop clusters, it’s just a matter of time — one to two years, tops — before all EDW vendors bring Hadoop into the heart of their architectures. For those EDW vendors who haven’t yet committed to full Hadoop integration, the growing real-world adoption of this open-source approach will force their hands.
Where the next-generation EDW is concerned, the petabyte staging cloud is merely Hadoop’s initial footprint. Enterprises are moving rapidly toward the EDW as the hub for all advanced analytics. Forrester strongly expects vendors to incorporate the core Hadoop technologies — especially MapReduce, Hadoop Distributed File System, Hive, and Pig — into their core architectures. Again, the impressive growth in MapReduce as a lingua franca for predictive modeling, data mining, and content analytics will practically compel EDW vendors to optimize their platforms for MapReduce, alongside high-performance support for SAS, SPSS, R, and other statistical modeling languages and formats. We see clear signs that this is already happening, as with EMC Greenplum’s recent announcement of a Hadoop product family and indications from some of that company’s competitors that they have similar near-term road maps.
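Part of why MapReduce travels so well across predictive modeling, data mining, and content analytics is the simplicity of its contract: the programmer supplies only a map step that emits key/value pairs and a reduce step that aggregates values per key, and the platform handles distribution, shuffling, and fault tolerance. A single-process Python sketch of the pattern — the classic word count, with Hadoop’s shuffle/sort phase simulated by an in-memory sort, purely for illustration:

```python
from itertools import groupby
from operator import itemgetter

def map_phase(records):
    """Map step: emit a (word, 1) pair for every word in every input line."""
    for line in records:
        for word in line.lower().split():
            yield (word, 1)

def reduce_phase(pairs):
    """Reduce step: sum the values for each key.

    The sort stands in for Hadoop's shuffle/sort, which groups all pairs
    sharing a key onto the same reducer.
    """
    for key, group in groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0)):
        yield (key, sum(value for _, value in group))

counts = dict(reduce_phase(map_phase(["big data", "big analytics"])))
print(counts)  # → {'analytics': 1, 'big': 2, 'data': 1}
```

On a real cluster, the same two functions run in parallel across thousands of nodes against HDFS blocks — which is what makes the model attractive for the petabyte-scale statistical workloads the EDW vendors are now racing to support.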
Problems don’t care how you solve them. The only thing that matters is that you do indeed solve them, using any tools or approaches at your disposal.
When people speak of “Big Data,” they’re referring to problems that can best be addressed by amassing massive data sets and using advanced analytics to produce “Eureka!” moments. The issue of what approach — Hadoop cloud, enterprise data warehouse (EDW), or otherwise — gets us to those moments is secondary.
It’s no accident that Big Data mania has also stimulated a vogue in “data scientists.” Many of the core applications of Hadoop are scientific problems in linguistics, medicine, astronomy, genetics, psychology, physics, chemistry, mathematics, and artificial intelligence. In fact, Yahoo’s scientists not only had a predominant role in developing Hadoop but — as exploratory problem-solvers — they are active participants in Yahoo’s efforts to evolve Hadoop into an even more powerful scientific cloud platform.
The problems that are best suited to Hadoop and other Big Data platforms are scientific in nature. What they have in common is a need for analytical platforms and tools that can rapidly scale out to the petabyte level and support the following core features: