We all know the conventional wisdom about cloud computing: it's cheap, fast and easy. But is it really that much cheaper? Or is it simply optics that make it appear cheaper?
Optics can absolutely change your perception of the cost of something. Just think about your morning jolt of coffee. $3.50 for a no-foam, half-caf, sugar-free vanilla latte doesn't seem that expensive. It's a small daily expense when viewed by the drink. It appears even cheaper if you pay for it with a loyalty card where you don't even have to fork over the dough and the vanilla shot is free. But what if you bought coffee like IT buys technology? You would pay for it on an annual basis. That $3.50 latte would now be about $900/year. For coffee? How many of you would go for that deal? That's optics and it plays right into the marketing hands of the public cloud services your business is consuming today.
But optics aside, is that $99/month per user SaaS application just another $20,000 per year enterprise application? Is that $0.25 per hour virtual machine just another $2,190 per year hosted VM? No, it's not the same, because the pricing models are not just optics but an indication of the buying pattern that is possible. If you buy it the same way you do traditional IT, then yes, the math says there's little difference. The key to cloud economics is to not buy the cloud service the same way you buy traditional IT. The key to taking advantage is to not consume the cloud statically and by rote. Instead, consume only what you need when you need it — and be diligent about turning resources off when you aren't using them.
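The consumption point above can be made concrete with a back-of-the-envelope calculation. The rates and usage pattern below are illustrative assumptions (a $0.25/hour VM, a 10-hour weekday workload), not any vendor's actual pricing:

```python
# Hypothetical cost comparison: the same $0.25/hour VM consumed
# "traditional IT"-style (always on) versus cloud-style (on demand).
HOURLY_RATE = 0.25  # assumed on-demand price per VM-hour

# Always-on, traditional-IT style: running 24 hours a day, 365 days a year.
always_on_hours = 24 * 365
always_on_cost = always_on_hours * HOURLY_RATE

# Cloud-style consumption: 10 hours/day on weekdays only (~260 days/year),
# turned off the rest of the time.
on_demand_hours = 10 * 260
on_demand_cost = on_demand_hours * HOURLY_RATE

print(f"Always-on: {always_on_hours:,} hrs -> ${always_on_cost:,.2f}/year")
print(f"On-demand: {on_demand_hours:,} hrs -> ${on_demand_cost:,.2f}/year")
print(f"Savings:   {1 - on_demand_cost / always_on_cost:.0%}")
```

Run as written, the always-on VM costs $2,190 per year while the on-demand pattern costs $650 — roughly a 70% saving from nothing but consumption discipline.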
In 2011, my colleague James Staten and I published two lightweight vendor assessments of the private cloud and public cloud markets. These solutions sit at the extremes of the IaaS market. To kick off 2013, I published a full vendor evaluation of a market that sits in between these two IaaS deployment types — hosted private cloud. Forrester's Forrsights Hardware Survey, Q3 2012, showed that 46% of enterprises are prioritizing investments in private clouds in 2013. While slightly more than half plan to build a private cloud in their own data center, more than 25% said they prefer to rent one. Hosted private cloud opens the door to a variety of benefits: 1) You get to cloud from day one. 2) Compute resources are dedicated to you rather than shared with other clients. 3) It can enable future hybrid scenarios. 4) Licensing and compliance requirements are easier to meet. 5) You outsource the setup of the cloud and the management of the infrastructure, freeing you to focus on support and utilization.
Overall, this report revealed no leaders, but it did show strengths and weaknesses across the market and provides a framework and sample criteria for assessing vendors in this space. The research process also revealed some unexpected nuances:
Hosted private cloud and virtual private cloud are often used interchangeably within the market — despite being distinct deployment types.
Level and method of dedication varies greatly by solution.
Layers managed differ greatly by solution.
Although agility is a benefit, few solutions give their end users self-service access to resources; ticket-based request systems are common.
Many enterprises are using hosted private cloud for some unexpected advantages:
Amazon Web Services (AWS) held its first global customer and partner conference, re:Invent, in late November in Las Vegas, attracting approximately 6,000 attendees. While aimed squarely at developers, AWS highlighted two key themes that will appeal directly to enterprise IT decision-makers:
Continued global expansion. AWS cites customers in 190 countries, but the company is clearly pushing for greater penetration into enterprise accounts via aggressive global expansion. AWS now has nine regions (each of which has at least one data center), including three in Asia Pacific: Tokyo, Singapore, and Sydney.
An expanded services footprint within customer accounts. The major announcement at re:Invent was a limited preview of a new data warehouse (DW) service called Amazon Redshift — a fully managed, cloud-based, petabyte-scale DW. As my colleague Stefan Ried tweeted during the event, with a limit of 1.6 petabytes, this is not just for testing and development — this is a serious production warehouse.
As the end of 2012 approaches, there is one clear takeaway about the cloud computing market — enterprise use has arrived. Cloud use is no longer hiding in the shadows, IT departments are no longer denying it’s happening in their companies, and legitimate budgeting around cloud is now taking place. According to the latest Forrsights surveys, nearly half of all enterprises in North America and Europe will set aside budget for private cloud investments in 2013, and nearly as many software development managers are planning to deploy applications to the cloud.
So what does that mean for the coming year? In short, cloud use in 2013 will get real. We can stop speculating, hopefully stop cloudwashing, and get down to the real business of incorporating cloud services and platforms into our formal IT portfolios. As we get real about cloud, we will institute some substantial changes in our cultures and approaches to cloud investments. We asked all the contributors to the Forrester cloud playbook to weigh in with their cloud predictions for the coming year, then voted for the top ten. Here is what we expect to happen when enterprise gets real about cloud in 2013:
The year 2012 brought significant growth in enterprise use of cloud services, but did it fulfill our expectations? With just five weeks left in the year, it’s time to reflect on our predictions for this market in 2012. Back in November 2011, we said that the cloud market was entering a period of rebellion, defiance, exploration, and growth, not unlike the awkward teenage years of a person’s life. The market certainly showed signs of teen-like behavior in 2012, but many of the changes we foresaw, it appears, will take several years to play out.
Out of all the inquiries I get from Forrester enterprise clients, the above question is by far the most common these days. However, the question shows that we have a lot to learn about true public cloud environments.
I know I sound like a broken record when I say this, but public clouds are not traditional hosting environments, and thus you can't just put any app that can be virtualized into the cloud and expect the same performance and resiliency. Apps in the cloud need to adapt to the cloud - not the other way around (at least not today). This means you shouldn't be thinking about what applications you can migrate to the cloud. That isn't the path to lower costs and greater flexibility. Instead, you should be thinking about how your company can best leverage cloud platforms to enable new capabilities. Then create those new capabilities as enhancements to your existing applications.
This advice should sound familiar if you have been in the IT business for more than a decade. Back in 1999 we did the same thing. As the Web was emerging, we didn't pick up our UNIX applications and move them to the web. We instead built new web capabilities and put them in front of the legacy systems (green screen scrapers, anyone?). The new web apps were built in a new way - using the LAMP stack, scaling out, and being geographically dispersed through hosting providers and content delivery networks. We learned new programming architectures, languages, and techniques for availability and performance. Cloud platforms require the same kind of thinking.
If you have dismissed Microsoft as a cloud platform player up to now, you might want to rethink that notion. With the latest release of Windows Azure here at Build, Microsoft’s premier developer shindig, this cloud service has become a serious contender for the top spot in cloud platforms. And all the old excuses that may have kept you away are quickly being eliminated.
In typical Microsoft fashion, the Redmond, Washington giant is attacking the cloud platform market with a competitive furor that can only be described as fast follower. In 2008, Microsoft quickly saw the disruptive change that Amazon Web Services (AWS) represented and accelerated its own lab project centered on delivering Windows as a cloud platform. Version 1.0 of Azure was decidedly different and immature and thus struggled to establish its place in the market. But with each iteration, Microsoft has expanded Azure’s applicability, appeal, and maturity. And the pace of change for Windows Azure has accelerated dramatically under the new leadership of Satya Nadella. He came over from the consumer Internet services side of Microsoft, where new features and capabilities are normally released every two weeks — not every two years, as had been the norm in the server and tools business prior to his arrival.
Well, if you're going to make a dramatic about-face from total dismissal of cloud computing, this is a relatively credible way to do it. Following up on its announcement of a serious cloud future at Oracle OpenWorld 2011, the company delivered new cloud services with some credibility at last week's show. It's a strategy laser-focused on selling to Oracle's own installed base, with all guns aimed at Salesforce.com. While the promise from last year was a homegrown cloud strategy, most of this year's execution has been bought. The strategy is essentially to deliver enterprise-class applications and middleware any way you want them - on-premises, hosted and managed, or true cloud. A quick look at where they are and how they got here:
The long-rumored changing of the guard at VMware finally took place last week, and with it fell a stubborn strategic stance that had been a big client dissatisfier. Out went the ex-Microsoft visionary who dreamed of delivering a new "cloud OS" that would replace Windows Server as the corporate standard, and in came a pragmatic refocusing on infrastructure transformation that acknowledges the heterogeneous reality of today's data center.
Paul Maritz will move into a technology strategy role at EMC, where he can focus on how the greater EMC company can raise its relevance with developers. Clearly, EMC needs developer influence and application-level expertise, and from a stronger, full-portfolio perspective. Here, his experience can be more greatly applied -- and we expect Paul to shine in this role. However, I wouldn't expect to see him re-emerge as CEO of a new spin-out of these assets. At heart, Paul is more of a natural technologist, and it's not clear all these assets would move out as one anyway.
On July 11, 2012, SingTel launched its PowerON Compute cloud service in Hong Kong. While certainly interesting on its own, I believe this announcement is particularly noteworthy as a harbinger of things to come.
Some key points to consider:
As a hybrid offering, PowerON Compute is a dynamic infrastructure services solution hosted in SingTel’s data centers in Singapore, Australia, and now Hong Kong. The computing resources (e.g., CPU, memory, storage) can be accessed either via a public Internet connection or a private secured network.
This announcement confirms the findings of my February 2012 report, “Sizing the Cloud Markets in Asia Pacific”: that market demand for cloud-based computing resources in Asia Pacific (AP) will rapidly shift from infrastructure-as-a-service (IaaS) to dynamic infrastructure services.