It was only about a year ago that Larry Ellison was confusing the OpenWorld audience with his “cloud in a box” approach, and only a very few CIOs had managed to turn a large Oracle landscape into a real private cloud offered to their business units on an opex model. But a lot has changed since last year.
Last week Verizon held its Verizon Developer Community (VDC) Conference in Las Vegas, where the company unveiled an updated and newly branded Verizon Apps store, which replaces the VCast app store. The Verizon Apps store includes improved search capabilities through a partnership with Chomp. Verizon certifies the apps in the store and is reducing the time necessary to test and install new applications to within two weeks. The Verizon Apps store will be accessible on Droid smartphones, and users can purchase apps and pay for them through their Verizon phone bill. Verizon is also creating a new private application store for businesses, which will include applications built by enterprises and third parties to address the specific needs of line of business workers within the organization. These enterprise app stores will provide yet another distribution channel for developers.
It is important to recognize that mobile application developers have a lot of choices regarding which mobile storefronts they use to distribute their applications, including the Android Market, the Apple App Store, and app stores from many other telecom operators and mobile device manufacturers. To capture the mindshare of developers and facilitate the success of the store, it is important to:
1) Provide marketing opportunities for developers. Competitive application stores include hundreds of thousands of applications, making it difficult for developers to get visibility for their applications. Developers also want to ensure their applications are seen by the correct user segments. Offering segmented marketing programs to ensure relevant users have visibility into the appropriate applications is a way to address this issue.
An important prerequisite for a full cloud broker model is the technical capability of cloud bursting:
Cloud bursting is the dynamic relocation of workloads from private environments to cloud providers and vice versa. A workload can represent IT infrastructure or end-to-end business processes.
The initial meaning of cloud bursting was relatively simple. Consider this scenario: An enterprise with traditional, non-cloud infrastructure runs out of capacity and temporarily gets additional compute power from a cloud service provider. Many enterprises have since established private clouds, and cloud bursting fits even better here, with dynamic workload relocation between private clouds, public clouds, and the more private provider models in between; Forrester calls these virtual private clouds. The private cloud literally bursts into the next cloud level at peak times.
An essential step before leveraging cloud bursting is properly classifying workloads. This means determining, for each workload, the most public cloud level it can tolerate, based on technical restrictions and data privacy needs (including compliance concerns). A conservative enterprise could structure its workloads into three classes of cloud:
Productive workloads of back-office data and processes, such as financial applications or customer-related transactions: These need to remain on-premises. An example is the trading system of an investment bank.
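The classification step described above can be sketched as a simple rule set. The following Python illustration is a minimal sketch under assumed criteria; the workload attributes and the mapping rules are hypothetical examples, not a standard taxonomy:

```python
# Illustrative workload classifier: maps hypothetical workload
# attributes to the most public cloud level the workload can tolerate.
# The attributes and tier rules here are assumptions for illustration.

def classify_workload(contains_customer_data: bool,
                      compliance_regulated: bool,
                      latency_sensitive: bool) -> str:
    """Return the most public cloud tier this workload may burst to."""
    if contains_customer_data or compliance_regulated:
        # Back-office data and customer transactions stay in-house,
        # like the investment bank's trading system above.
        return "on-premises"
    if latency_sensitive:
        # Sensitive to placement but not regulated: a virtual
        # private cloud is the most public acceptable tier.
        return "virtual-private-cloud"
    # Everything else can burst to a public cloud at peak times.
    return "public-cloud"

print(classify_workload(True, False, False))   # -> on-premises
print(classify_workload(False, False, True))   # -> virtual-private-cloud
print(classify_workload(False, False, False))  # -> public-cloud
```

Once every workload carries such a label, a bursting policy can be applied mechanically: only workloads tagged for a more public tier than their current placement are candidates for relocation at peak times.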
Commoditization and economies of scale drive cloud computing. We've seen before what commoditization means to the IT industry: It stimulates standardization and consolidation to serve a volume market. The success of x86-based CPUs, which dominate everything from data centers to high-performance computing, is a common example. In the software industry, commoditization raised standards and led to the commercial adoption of open source.
It is impossible to imagine a corporate data center without Linux. Even if it doesn’t run under the core applications, Linux is present in web servers, routers, firewalls, and storage appliances. Upcoming cloud computing providers face increasing price pressure and must therefore achieve greater economies of scale than the majority of corporate data centers; they rely heavily on Linux as an operating system today.
But open source goes beyond Linux for cloud providers. Many providers of public cloud services (or the more local flavor, virtual private clouds) have realized that developing a full cloud computing management stack requires significant software development effort. Some of the leading midrange cloud providers, such as Rackspace, as well as several cloud component vendors (such as providers of cloud billing systems), support an open source initiative called OpenStack. The goal is to provide a full cloud management stack, based on open source, for cloud providers and operators of large-scale data centers such as NASA. What a Linux distribution is to a traditional server, OpenStack is to a full cloud provider business model, regardless of privacy level: public, private, or virtual private.
The Nebula appliance announced today jumps right into this space, providing a standardized hardware configuration for OpenStack implementations. It offers scaled-out compute power based on commodity x86 CPUs and standardizes a configuration of switches and other components to glue a large number of these CPUs together. The new VC-backed startup will thus compete head to head with EMC’s Vblock and Microsoft’s Azure appliance, neither of which is based on open source (and the latter isn’t really on the market yet).
But Nebula is more than just a hardware deliverable. Its mission is to transparently standardize the cloud hardware stack. In essence, it plays the same role as the complex specification Microsoft worked out with its hardware partners (Dell, Fujitsu, and HP) to deliver the Azure appliance to local cloud providers and large-scale private clouds. However, Nebula’s openness is the differentiator; it reminds me a bit of IBM’s approach around the original personal computer back in the early 1980s. Sure, it enabled hardware competitors to produce compatible PCs — but it also brought mass adoption of the PC, outpacing Apple in market share for decades.
If Nebula delivers a compelling price point, its approach is appealing enough to gain significant share in the growing cloud hardware market. But if the new company aims to spur a revolution similar to that of the PC, its founders need to tweak their strategy soon:
Yesterday, Amdocs, a leading provider of customer experience systems, agreed to acquire Bridgewater Systems, a provider of policy, subscriber management, and network control solutions for mobile and convergent service providers, for US$215 million. The resulting solution will combine Amdocs’ existing BSS/OSS and customer experience offerings with network-level information from Bridgewater’s products. The integrated organization will provide service providers with an enhanced ability to monetize data services and to support complex pricing strategies across multiple devices, users, and networks. From a customer perspective, Amdocs and Bridgewater share some key service provider customers, including Bell Mobility, Sprint, and Telstra.
Ericsson made a $1.15 billion offer to acquire Telcordia, a provider of operations support systems/business support systems (OSS/BSS) for telecom service providers. The Telcordia acquisition enables Ericsson to establish a leading position providing service fulfillment, service assurance, network optimization, and real-time charging services to wireless and wireline operators worldwide. Ericsson will gain Telcordia’s 200+ customers in 55 countries, including AT&T and Verizon in North America. The acquisition will also augment Ericsson’s managed service capabilities, which are currently used by firms including Sprint and Telefonica Brazil.
BSS/OSS solution demand is increasing because communication service providers must deploy a variety of converged services and applications to customers seamlessly across fixed and wireless networks. Telcordia’s OSS software solutions, combined with Ericsson’s global position, marketing clout, and existing billing services, will enable the combined organization to offer a more comprehensive range of BSS/OSS solutions. This acquisition will also change the competitive and fragmented BSS/OSS vendor landscape. Ericsson will be in a better position to compete against key BSS/OSS vendors, including Amdocs, Huawei, and Nokia Siemens Networks, as well as systems integrators, including Accenture, IBM, and Hewlett-Packard, that also offer BSS/OSS solutions.
Last week I attended the TM Forum Management World 2011 conference, held for the first time in Dublin, Ireland. Over 3,500 attendees, including nearly 200 service providers, participated in the event, the largest group ever to attend the conference. The event is sponsored by the TM Forum, a global industry association focused on simplifying the complexity of running a service provider’s business. For me, the highlight of the event was the renewed positive energy and focus of the delegates, vendors, and service providers in identifying ways for communication service providers to innovate and collaborate to address emerging opportunities in the communications industry.
Keynote presentations during the forum highlighted the diverse perspectives of traditional telecom operators and new participants in the communications industry. For example, Kevin Peters, Chief Marketing Officer of AT&T Business Solutions, highlighted the network as a critical asset that is global, efficient, and robust, with the ability to connect billions of smartphone devices and the growing number of machine-to-machine (M2M) end points. These connected devices use the network to provide intelligence and insight into the status of objects, items, and people. Stephen Shurrock, Chief Executive Officer of Telefonica O2 Ireland, highlighted the ability to deliver a wide variety of new mobile applications and services.
Today, Forrester published its report “Sizing The Cloud.” This report applies Forrester’s sizing methodology for emerging markets to the cloud computing market for the first time. This methodology allows us not only to apply our growth assumptions to the current numbers for cloud markets, but also to take into account when those markets will hit saturation and see commoditization, which affects the price of a cloud service over time. The result? The first serious attempt to size the cloud computing market for the next 10 years.
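The mechanics of such a sizing model can be illustrated with a toy calculation: revenue in each year is the number of adopters times the unit price, where adoption follows an S-curve toward saturation and the price erodes annually as the service commoditizes. All constants below are hypothetical placeholders for illustration, not Forrester’s actual assumptions or data:

```python
import math

# Toy market-sizing model: revenue(t) = adopters(t) * price(t).
# Adoption follows a logistic curve toward a saturation ceiling;
# the unit price declines each year as the service commoditizes.
# Every number here is an illustrative placeholder.

SATURATION = 100_000     # hypothetical: maximum number of adopting firms
GROWTH = 0.6             # hypothetical logistic growth rate
MIDPOINT = 5             # hypothetical year at which adoption hits 50%
PRICE_0 = 50_000.0       # hypothetical first-year price per firm ($)
PRICE_DECLINE = 0.10     # hypothetical 10% price erosion per year

def adopters(year: int) -> float:
    """Adopting firms in a given year (logistic adoption curve)."""
    return SATURATION / (1 + math.exp(-GROWTH * (year - MIDPOINT)))

def price(year: int) -> float:
    """Unit price after 'year' years of commoditization."""
    return PRICE_0 * (1 - PRICE_DECLINE) ** year

for year in range(11):
    revenue = adopters(year) * price(year)
    print(f"year {year:2d}: ${revenue / 1e9:5.2f}B")
```

Running the sketch shows the qualitative shape the report describes: revenue grows steeply while adoption climbs, then flattens once the market nears saturation and price erosion starts to offset volume growth.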
Forrester’s Forrsights Software Survey, Q4 2010 has quantified for the first time how enterprise demand is shifting from traditional licensing models to subscriptions and other licensing models, such as financing and license leasing. The shift to subscriptions for business applications as a service is the major driver of this change. Traditional enterprise licenses are slowly decreasing, and Forrester predicts that subscriptions for SaaS applications will drive alternative license spending up to 29% as early as 2011. This demand-side change goes beyond front-office applications like CRM. In 2011 and 2012, enterprises will opt for “as-a-service” subscriptions for more back-office applications, such as ERP, instead of licensed, on-premises installations. Detailed data cuts by company size and region are available to clients from our Forrsights service.
Base: 622 (2007), 1,026 (2008), 537 (2009), and 930 (2010) software decision-makers predicting license spending for the coming year.
Source: Enterprise And SMB Software Survey, North America And Europe, Q3 2007; Enterprise And SMB Software Survey, North America And Europe, Q4 2008; Enterprise And SMB Software Survey, North America And Europe, Q4 2009; Forrsights Software Survey, Q4 2010.
What does this mean for existing independent software vendors (ISVs) and infrastructure vendors?