A lot of tech vendors – and channel partners – are struggling to figure out what channel partners' role in the cloud services demand chain is going to be. Technology is decreasingly delivered and consumed as an on-premise installation (a function performed by, and the original raison d'être of, channel partners) and increasingly delivered as a service by a service provider. In the software sector, that service provider is typically (but not always) the software vendor (think: salesforce.com).
And, in most cases, for good reason. Software has bugs. Early versions of software can be unstable and unpredictable. In the classic channel-partner-sells-and-installs-software model, the product (the software) remains in the control of the software vendor, i.e., the vendor assumes the risk of customers’ unmet expectations. The license is between the vendor and the customer, and the vendor is on the hook for providing bug fixes and tier-2 and -3 support.
As much as many channel partners would like to act as application hosters (and many of them do – approximately 15% of software is delivered via a hosting model today, and 20% of channel partners today have a hosting business [see “Channel Models In The Era Of Cloud”]), when it comes to early-version or mission-critical software, vendors simply can’t risk putting the as-a-service service level/performance responsibility in the hands of channel partners. Service failures, over which the vendor would have no control, would result in egg (or worse!) on the vendor’s brand, not the channel partner’s. Until tech vendors’ partner programs mature to the point where they can certify partners’ data centers, those vendors are going to be reluctant to hand over the data center reins to partners.
Cloud infrastructure-as-a-service (IaaS) is a hot market. Amazon Web Services, now five years old, draws a lot of attention and customer volume, but the vendor strategists at enterprise-facing providers such as IBM, HP, AT&T, and Verizon have also been building and delivering IaaS offerings. As I’ve studied the market, I’ve heard wildly different types of requirements from buyers and quite a range of offerings from service providers. Yet much of the industry dialogue centers on one idea of what IaaS is – I think that’s wrongheaded. I found that there were really two buyer types: 1) informal buyers outside of the IT operations/data center manager organizations, such as engineers, scientists, marketing executives, and developers, and 2) formal buyers, the IT operations and data center managers responsible for operating applications and maintaining infrastructure.
With this idea in mind, I set out to test the views of IT infrastructure buyers in the Forrsights Hardware Survey, Q3 2010 and learned that:
After 2+ years of cloud hype, only 6% of enterprise IT infrastructure respondents report using IaaS, with another 7% planning to implement it by Q3 2012. After flat adoption from 2008 to 2009, this represents roughly a doubling from 2009, off a very small base.
Almost two-thirds of IT infrastructure buyers don’t believe they themselves are the primary buyers of cloud IaaS! We asked them which groups in their company are using or most interested in cloud IaaS. Only 36% of IT infrastructure buyers named themselves, and 7% didn’t know. The remaining 58% said that IT developers, website owners, business unit owners of batch compute-intensive apps, and other business unit developers were more interested in using IaaS than they were.
Calxeda, one of the most visible stealth mode startups in the industry, has finally given us an initial peek at the first iteration of its server plans, and they both meet our inflated expectations from this ARM server startup and validate some of the initial claims of ARM proponents.
While still holding its actual delivery dates and detailed specifications close to its vest, Calxeda did reveal the following cards from its hand:
The first reference design, which will be provided to OEM partners as well as delivered directly to selected end users and developers, will be based on a quad-core ARM Cortex-A9 system-on-chip (SOC) design.
The SOC, as Calxeda will demonstrate with one of its reference designs, will enable OEMs to design servers as dense as 120 ARM quad-core nodes (480 cores) in a 2U enclosure, with an average consumption of about 5 watts per node (1.25 watts per core) including DRAM.
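The density and power figures Calxeda quoted are internally consistent, as a quick sanity-check sketch shows (the per-node wattage is the average Calxeda cited, not a measured value):

```python
# Sanity-check the density/power figures Calxeda quoted for its 2U reference design.
NODES_PER_2U = 120     # quad-core SOC server nodes per 2U enclosure
CORES_PER_NODE = 4
WATTS_PER_NODE = 5.0   # average, including DRAM (Calxeda's figure)

cores_per_2u = NODES_PER_2U * CORES_PER_NODE
watts_per_core = WATTS_PER_NODE / CORES_PER_NODE
watts_per_2u = NODES_PER_2U * WATTS_PER_NODE

print(cores_per_2u)    # 480 cores in a 2U enclosure, matching Calxeda's claim
print(watts_per_core)  # 1.25 W per core
print(watts_per_2u)    # 600.0 W for the fully populated enclosure
```

That implied ~600 W for 480 cores in 2U is the figure that makes data center operators sit up.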
While Calxeda was not forthcoming with details about performance, topology, or protocols, the SOC will contain an embedded fabric that allows the individual quad-core SOC servers to communicate with one another.
Most significantly for prospective users, Calxeda claims, and has some convincing models to back up these claims, that its products will deliver 5X to 10X the performance/watt (and an even larger advantage when price is factored in, as a performance/watt/$ metric) of any products it expects to see on the market when it launches.
Intel, despite a popular tendency to associate a dominant market position with indifference to competitive threats, has not been sitting still waiting for the ARM server phenomenon to engulf it in a wave of ultra-low-power servers. Intel is fiercely competitive, and it would be silly for any new entrants to assume that Intel will ignore a threat to the heart of a high-growth segment.
In 2009, Intel released a microserver specification for compact low-power servers, and along with competitor AMD, it has been aggressive in driving down the power envelope of its mainstream multicore x86 server products. Recent momentum behind ARM-based servers has heated up this potential competition, however, and Intel has taken the fight deeper into the low-power realm with the recent introduction of the N570, an existing embedded low-power processor repositioned as a server CPU aimed squarely at emerging ultra-low-power and dense servers. The N570, a dual-core Atom processor, is currently used by a single server partner, ultra-dense server manufacturer SeaMicro (see “Little Servers For Big Applications At Intel Developer Forum”), and will allow SeaMicro to deliver its current 512 Atom cores with half the number of CPU components and some power savings.
Technically, the N570 is a dual-core Atom CPU with 64-bit arithmetic (a differentiator against ARM) and the same 32-bit (4 GB) physical memory limitation as current ARM designs, and it should have a power dissipation of between 8 and 10 watts.
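Putting the quoted numbers side by side gives a rough feel for the per-core power gap. This is a back-of-the-envelope comparison only: it uses the 8-10 W dissipation range cited above for the N570 and Calxeda's 1.25 W/core claim, and it deliberately ignores any performance difference between an Atom core and a Cortex-A9 core:

```python
# Rough per-core power comparison from the figures quoted in the text.
# N570: dual-core Atom, 8-10 W dissipation; Calxeda: 1.25 W per core incl. DRAM.
n570_cores = 2
n570_watts_low, n570_watts_high = 8.0, 10.0
calxeda_watts_per_core = 1.25

n570_per_core_low = n570_watts_low / n570_cores    # 4.0 W/core
n570_per_core_high = n570_watts_high / n570_cores  # 5.0 W/core

# Ratio of Atom to ARM per-core power, on these quoted numbers alone:
ratio_low = n570_per_core_low / calxeda_watts_per_core
ratio_high = n570_per_core_high / calxeda_watts_per_core
print(f"N570 draws roughly {ratio_low:.1f}x-{ratio_high:.1f}x the power per core")
```

On these figures alone, the N570 draws roughly 3x-4x the power per core, which is why Intel's low-power roadmap, not raw core counts, will decide how credible the ARM threat becomes.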
SAP Has Managed A Turnaround After Léo Apotheker’s Departure
In February 2010, after Léo Apotheker resigned as CEO of SAP, I wrote a blog post with 10 predictions for the company for the remainder of the year. Although the new leadership said again and again that this step would not influence the company’s strategy, it was clear that further changes would follow: it makes no sense to simply replace the CEO and leave everything else as is when the company’s problems were obviously growing bigger.
I predicted that the SAP leadership change was just the starting point, the visible tip of an iceberg, with further changes to come. Today, one year later, I want to review these predictions and shed some light on 2010, which has become the “Turnaround Year For SAP.”
The 10 SAP Predictions For 2010 And Their Results (7 proved true / 3 proved wrong)
Only a few weeks to go before Forrester’s US EA Forum 2011 in San Francisco in February! I’ll be presenting a number of sessions, including the opening kickoff, where I’ll paint a picture of where I see EA going in the next decade. As Alex Cullen mentioned, I’ll examine three distinct scenarios where EA rises in importance, EA crashes and burns, or EA becomes marginalized.
But the most fun I’ve had preparing for this year’s event is putting together a new track: “Key Technology Trends That Will Change Your Business.” In the past, we’ve focused this conference on the practice of EA and used our big IT Forum conference in the spring to talk about technology strategies, but this year I’ve had the opportunity to put together five sessions that drill down into the technology trends that we think will have significant impact in your environment, with a particular focus on impacting business outcomes. Herewith is a quick summary of the sessions in this track:
The General Services Administration made a bold decision to move its email and collaboration systems to the cloud. In the RFP issued last June, it was easy to see their goals in the statement of objectives:
This Statement of Objectives (SOO) describes the goals that GSA expects to achieve with regard to the
1. modernization of its e-mail system;
2. provision of an effective collaborative working environment;
3. reduction of the government’s in-house system maintenance burden by providing related business, technical, and management functions; and
4. application of appropriate security and privacy safeguards.
GSA announced yesterday that it chose Google Apps for email and collaboration and Unisys as the implementation partner.
So what does this mean?
What it means (WIM) #1: GSA employees will be using a next-generation information workplace. And that means mobile, device-agnostic, and location-agile. Gmail on an iPad? No problem. Email from a home computer? Yep. For GSA and for every other agency and most companies, it's important to give employees the tools to be productive and engage from every location on every device. "Work becomes a thing you do and not a place you go." [Thanks to Earl Newsome of Estee Lauder for that quote.]
With about 41,000 attendees, 1,800 sessions, and a whopping 63,000-plus slides, Oracle OpenWorld 2010 (September 19-23) in San Francisco was certainly a mega event with more information than one could possibly digest or even collect in a week. While the main takeaway for every attendee depends, of course, on the individual’s area of interest, there was a strong focus this year on hardware due to the Sun Microsystems acquisition. I’m a strong believer in the integration story of “Hardware and Software. Engineered to Work Together.” and really liked the Iron Man 2 show-off all around the event; but, because I’m an application guy, the biggest part of the story, including the launch of Oracle Exalogic Elastic Cloud, was a bit lost on me. And the fact that Larry Ellison basically repeated the same story in his two keynotes didn’t really resonate with me — until he came to what I was most interested in: Oracle Fusion Applications!
On September 15th between 11am and 12pm EDT, Forrester held an interactive TweetJam on the future of cloud computing, including Forrester analysts Jennifer Belissent, Mike Cansfield, Pascal Matzke, Stefan Ried, Peter O’Neill, me, and many other experts and interested participants. Using the hashtag #cloudjam (use this tag to search for the results on Twitter), we asked a variety of questions.
We had a great turnout, with more than 400 tweets (at last count) from over 40 unique tweeters. A high-level overview of the key words and topics mentioned during the TweetJam is visualized in the attached graphic, created with the ManyEyes data visualization tool.
Below you will find a short summary of some key takeaways and quotes from the TweetJam:
1. What really is cloud computing? Let’s get rid of 'cloud washing!'
Have questions about cloud computing and the top challenges and opportunities it presents to vendors and users? Then join us for an interactive Tweet Jam on Twitter about the future of cloud computing on Wednesday, September 15th, 2010 from 11:00 a.m. – 12:00 p.m. EDT (17:00 – 18:00 CEST) using the Twitter hashtag #cloudjam. Joining me (@hkisker) will be my analyst colleagues Mike Cansfield (@mikecansfield), Pascal Matzke (@pascalmatzke), Thomas Mendel (@drthomasmendel), and Stefan Ried (@stefanried). We’ll share the results of our recent research on the long term future of cloud computing and discuss how it will change the way tech vendors engage with customers.
Looking through the current industry hype around the cloud, Forrester believes cloud computing is a sustainable, long-term IT paradigm. Underpinned by both technology and economic disruptions, we think the cloud will fundamentally change the way technology providers engage with business customers and individual users. However, many customers are suffering from "cloud confusion" as vendors' marketing stretches cloud across a wide variety of capabilities.
To help, we recently developed a new taxonomy of the cloud computing markets (see graphic) to give vendors and customers clear definitions and labels for cloud capabilities. With this segmentation in hand, cloud vendors and users can better discuss the challenges and benefits of cloud computing today and in the future.