The US economy continues to show improvement – for example, today’s news that new jobless claims were near a four-year low. As the economic outlook has improved, so, too, have prospects for the US tech market. In our updated Forrester forecast for US tech purchases, "US Tech Market Outlook For 2012 To 2013: Improving Economic Prospects Create Upside Potential," we now project growth of 7.5% in 2012 and 8.3% in 2013 for business and government purchases of information technology goods and services (without telecom services). Including telecom services, business and government spending on information and communications technology (ICT) will increase by 7.1% in 2012 and 7.4% in 2013.
The lead tech growth category will shift from computer equipment in 2011 to software in 2012 and 2013, with IT consulting and systems integration services playing a strong supporting role. Following strong growth of 9.6% in 2011, computer equipment purchases will slow to 4.5% in 2012, as the lingering effects of Thailand's 2011 floods hurt parts supply in the first half and the prospect of Windows 8 dampens Wintel PC sales until the fall. Apple Macs and iPad tablets will post strong growth in the corporate market, though, and server and storage purchases should grow in the mid-single digits.
Today's move by Citrix to put its CloudStack IaaS solution into the Apache Foundation says more about the state of the cloud market than it does about OpenStack. As our Fall 2011 Forrsights Hardware Survey shows, about 36% of enterprise IT leaders are prioritizing and planning to invest in IaaS this year. That means they need solutions today, and thus service providers and cloud software vendors need answers they can take to market now. OpenStack, while progressing well, simply isn't at that point yet.
Second, Citrix needed to clarify the position of its current open source–based solution. Ever since Citrix joined OpenStack, its core technology has been in somewhat of a limbo state. The code in cloudstack.org overlaps with a lot of the OpenStack code base, and Citrix's official stance had been that when OpenStack was ready, it would incorporate it. This made it hard for a service provider or enterprise to bet on CloudStack today, for fear that they would have to migrate to OpenStack over time. That might still happen, as Citrix has kept its pledge to incorporate OpenStack software if and when the time is right, but the company is clearly betting its fortunes on cloudstack.org's success.
There are myriad other benefits that come from this move. Two of the biggest are:
Amazon Web Services (AWS) is great, but many of our enterprise clients want those cloud services and value delivered on-premises, behind their firewall, which may feel more comfortable for protecting their intellectual property (even if it isn't). AWS isn't very interested in providing an on-premises version of its solution (and I don't blame them). Today's partnership announcement with Eucalyptus Systems doesn't fully address this customer demand but does give some degree of assurance that your private cloud can be AWS compatible.
This partnership is a key value for organizations that have already seen significant adoption of AWS by their developers, as those empowered employees have established programmatic best practices for using these cloud services — procedures that call AWS' APIs directly. Getting them to switch to your private cloud (or use both) would mean a significant change for them. And winning over your developers to use your cloud is key to a successful private cloud strategy. It could also double your work to design and deploy cloud management solutions that span the two environments.
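To make the "call AWS' APIs directly" point concrete, here is a minimal sketch of an EC2-style query request, showing why API compatibility matters: the same call works against AWS or an AWS-compatible private cloud, with only the endpoint swapped. The hostnames and API version string below are illustrative placeholders, and a real request would also carry AWS request signing, which is omitted here.

```python
# Sketch: the same EC2-style query API call against two control planes.
# Hostnames and the Version string are hypothetical; request signing
# (required by real AWS endpoints) is deliberately left out.
from urllib.parse import urlencode

def build_request(endpoint: str, action: str, **params) -> str:
    """Build an EC2-style query API URL for the given endpoint and action."""
    query = urlencode({"Action": action, "Version": "2012-03-01", **params})
    return f"https://{endpoint}/?{query}"

# Identical call, different control plane:
aws_url = build_request("ec2.us-east-1.amazonaws.com", "DescribeInstances")
private_url = build_request("cloud.example.internal:8773", "DescribeInstances")

print(aws_url)
print(private_url)
```

Tooling built for AWS (scripts, libraries, deployment procedures) can then be pointed at the private cloud by changing a single configuration value rather than rewriting the integration.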
Cloud providers and many federated IAM practitioners are excited about OAuth, a new(ish) security technology on the scene. I’ve written about OAuth in Protecting Enterprise APIs With A Light Touch. The cheat-sheet list I keep of major OAuth product support announcements already includes items from Apigee, Covisint, Google, IBM, Layer 7, Microsoft, Ping Identity, and salesforce.com. (Did I miss yours? Let me know.)
OAuth specializes in securing API/web service access by a uniquely identified client app on behalf of a uniquely identified user. It has flows for letting the user explicitly consent to (authorize) this connection, but generally relies on authorizing the actions of the calling application itself through simple authentication. So does the auth part of the name stand for authentication, authorization, or what? Let’s go with “all of the above.”
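The two styles described here (the calling app authenticating itself with simple credentials, and acting on a user's behalf with a token issued after the consent flow) ultimately boil down to HTTP headers on the API call. A minimal sketch, with all credential and token values as hypothetical placeholders:

```python
# Sketch of the two credential styles an OAuth-protected API sees.
# All IDs, secrets, and tokens below are made-up placeholders.
import base64

def basic_auth_header(client_id: str, client_secret: str) -> str:
    """HTTP Basic credentials: base64-encoded "id:secret" pair,
    commonly used by the client app to authenticate itself."""
    creds = f"{client_id}:{client_secret}".encode("utf-8")
    return "Basic " + base64.b64encode(creds).decode("ascii")

def bearer_auth_header(access_token: str) -> str:
    """An OAuth access token presented as a bearer credential when
    calling the API on the user's behalf after consent."""
    return "Bearer " + access_token

# The client app identifies itself to the authorization server:
print(basic_auth_header("app-123", "s3cret"))
# Prints: Basic YXBwLTEyMzpzM2NyZXQ=

# ...then calls the API with the token it was issued:
print(bearer_auth_header("opaque-token-from-auth-server"))
```

The app-level credential answers "which application is calling?", while the bearer token answers "on whose behalf, and with what consent?" — which is why "all of the above" is a fair reading of the auth in OAuth.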
However, OAuth is merely plumbing of a sort similar to the WS-Security standard (or, for that matter, HTTP Basic Authentication). It doesn’t solve every auth* problem known to humankind, not by a long shot. What other IAM solutions are popping up in the API-economy universe? Two standards communities are building solutions on top of OAuth to round out the picture:
CeBIT 2012 kicks off tomorrow — and believe it or not, it’s still the world’s biggest IT show, attracting 339,000 visitors last year and very likely even more this year.
Cloud computing is all over the fair this year (again), but some vendors have managed to move beyond cloud infrastructure and are starting to combine the cloud's ease of use, standardization, and opex-based consumption model with business software. I had the chance to talk to some vendors last week about their upcoming announcements. Forrester analyst Holger Kisker already pointed this out in his "10 Cloud Predictions For 2012":
The Wild West of cloud procurement is over! More enterprises and SMBs than ever are developing a formal strategy to purchase cloud services in 2012. The easiest consolidated way to do this is through an app store or cloud marketplace.
Last week it was Dell’s turn to tout its new wares, as it pulled back the curtain on its 12th-generation servers and associated infrastructure. I’m still digging through all the details, but at first glance it looks like Dell has been listening to a lot of the same customer input as HP, and as a result its messages (and very likely the value delivered) are in many ways similar. Among the highlights of Dell’s messaging are:
Faster provisioning with next-gen agentless intelligent controllers — Dell’s version is iDRAC7, and in conjunction with its Lifecycle Controller firmware, Dell makes many of the same claims as HP, including faster time to provision and maintain new servers, automatic firmware updates, and many fewer administrative steps, resulting in opex savings.
Intelligent storage tiering and aggressive use of flash memory, under the aegis of Dell’s “Fluid Storage” architecture, introduced last year.
A high-profile positioning for its Virtual Network architecture, building on its acquisition of Force10 Networks last year. With HP and now Dell aiming for more of the network budget in the data center, it’s not hard to understand why Cisco was so aggressive in pursuing its piece of the server opportunity — any pretense of civil coexistence in the world of enterprise networks is gone, and the only mutual interest holding the vendors together is their customers’ demand that they continue to play well together.
Corporate CIOs should not ignore the network-centric nature of cloud-based solutions when developing their cloud strategies and choosing their cloud providers. And end users should understand what role(s) telcos are likely to play in the evolution of the wider cloud marketplace.
Like many IT suppliers, telcos view cloud computing as a big opportunity to grow their business. Cloud computing will dramatically affect telcos — but not by generating significant additional revenues. Instead, cloud computing will alter the role of telcos in the value chain irreversibly, putting their control over usage metering and billing at risk. Alarm bells should ring for telcos as Google, Amazon, et al. put their own billing and payment relationships with customers in place.
Telcos must defend their revenue collection role at all costs; failure to do so will accelerate their decline to invisible utility status. At the same time, cloud computing offers telcos a chance to become more than bitpipe providers. Cloud solutions will increasingly be delivered by ecosystems of providers that include telcos, software, hardware, network equipment vendors, and OTT providers.
Telcos have a chance to leverage their network and financial assets to grow into the role of ecosystem manager. To start on this path, telcos will provide cloud-based solutions adjacent to communication services they already deliver, such as building on home area networking and machine-to-machine connectivity to offer connected healthcare and smart grid solutions. Expanding from this beachhead into a broader role in cloud solutions markets is a tricky path that only some telcos will successfully navigate.
We are analyzing the potential role of telcos in cloud computing markets in the research report "Telcos As Cloud Rainmakers."
At its recent financial analyst day, AMD indicated that it intended to differentiate itself by creating products that were advantaged in niche markets, with specific mention, among other segments, of servers, and to generally shake up the trench warfare that has had it on the losing side of its lifelong battle with Intel (my interpretation, not AMD management’s words). Today, at least for the server side of the business, AMD made a move that can potentially offer it visibility and differentiation by acquiring innovative server startup SeaMicro.
SeaMicro has attracted our attention since its appearance (blog post 1, blog post 2), with its innovative architecture that dramatically reduces power and improves density by sharing components like I/O adapters, disks, and even BIOS over a proprietary fabric. The irony here is that SeaMicro came to market with a tight alignment with Intel, which at one point even introduced a special dual-core packaging of its Atom CPU to allow SeaMicro to improve its density and power efficiency. Most recently, SeaMicro and Intel announced a new model featuring Xeon CPUs to address the more mainstream segments that SeaMicro’s original Atom-based offering was not suited for.
Yesterday, Amazon launched an adjunct to its successful Amazon Web Services (AWS) elastic cloud offering. While we don’t normally comment on every product release, this one is significant — primarily because of who is doing it. The Simple Workflow Service (SWF) clearly has nothing to do with Adobe’s Flash offering (although techno-nerds may initially think so, given the acronym).
So what was this all about? The business model is certainly interesting: an elastic, configurable workflow capability that’s distributed across any number of access points. Essentially, this will allow an organization to orchestrate processes in the cloud, linking participants up and down the value chain.
“Amazon Simple Workflow Service (Amazon SWF) is a workflow service for building scalable, resilient applications. Whether automating business processes for finance or insurance applications, building sophisticated data analytics applications, or managing cloud infrastructure services, Amazon SWF reliably coordinates all of the processing steps within an application.”
Pricing starts out free and then transitions into a blended, low-cost consumption model, with charges based on execution steps, bandwidth usage, how long tasks remain active, and signals/markers. With usage charges at around $0.0001 per execution step, this gives you an idea of how small the operating overhead might be.
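A back-of-the-envelope sketch of what that per-step rate implies, using only the $0.0001 figure quoted above; bandwidth, task-retention, and signal/marker charges are omitted, so this is a lower bound, and the workload sizes are hypothetical:

```python
# Lower-bound cost estimate using only the quoted per-step rate.
# Other SWF charge dimensions (bandwidth, retention, signals/markers)
# are deliberately ignored here.
PER_STEP_USD = 0.0001

def step_cost(executions: int, steps_per_execution: int) -> float:
    """Cost in USD for a batch of workflow executions."""
    return executions * steps_per_execution * PER_STEP_USD

# 10,000 workflow runs of 25 steps each:
print(f"${step_cost(10_000, 25):,.2f}")  # prints $25.00
```

Even a quarter-million execution steps cost on the order of tens of dollars, which is why the operating overhead is plausibly negligible for most orchestration workloads.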
The Dell brand is one of the most recognizable in technology. The company was born a hardware company in 1984 and deservedly rocketed to fame, but it has always been about the hardware. In 2009, its big Perot Systems acquisition marked the first real departure from this hardware heritage. While it has made numerous software acquisitions, including some good ones like Scalent, Boomi, and KACE, it remains a marginal player in software. That is about to change.