B2B communication, which originated with EDI messages, is the oldest and unfortunately the least flexible form of integration between systems and between enterprises. Many enterprises run B2B gateways on-premises or have managed service contracts for “their instance of their B2B Hub.”
Over the past months, I’ve received an increasing number of inquiries from Forrester clients asking about the future of this approach and the direction of the market. This is what I usually explain:
Your future cloud/legacy integration should cover both your business partners and your SaaS applications. Cloud computing is disrupting the integration space! Why? Traditionally, you had two very distinct integration scenarios. Either it was about the integration of multiple systems within your enterprise, where middleware software (with product categories like EAI, ESB, CIS, and BPM) was the matching solution, as all systems were on-premises. Or it was about the integration with your business partners, where the well-established B2B/EDI gateways and managed services were the matching solution over the Internet (or VANs). Cloud computing, however, has already disrupted this split: Suddenly, parts of your business units’ applications run in the cloud on packaged SaaS applications and need to be integrated with your on-premises legacy. Or you and your business partners even use the same SaaS application, and B2B traffic is as simple as moving data from one tenant to another on the same cloud platform. To face this increasing variety of integration, a good cloud integration strategy should look holistically at the synergies between cloud/legacy integration, integration with your business partners, and the SaaS tenants of your own enterprise!
Bridgekeeper: "What ... is your name?"
Traveler: "John Swainson of Dell."
Bridgekeeper: "What ... is your quest?"
Traveler: "Hey! That's not a bad idea!"
We suspect Dell's process was more methodical than that!
This acquisition was not a surprise, of course. All along, it has been obvious that Dell needed stronger assets in software as it continues on its quest to avoid the Gorge of Eternal Peril that is spanned by the Bridge of Death. When the company announced that John Swainson was joining to lead the newly formed software group, astute industry watchers knew the next steps would include an ambitious acquisition. We predicted such an acquisition would be one of Swainson's first moves, and after only four months on the job, indeed it was.
In typical Microsoft fashion, the company doesn’t get a new trend right with the first iteration, but it keeps at it, eventually strikes the right tone, and, more often than not, gets good enough. And often good enough wins. That seems to be the pattern playing out with Windows Azure, its cloud platform.
Just three months after SAP acquired SuccessFactors, a cloud leader in human capital management solutions, for $3.4 billion, it has now announced the acquisition of Ariba, a cloud leader in eProcurement solutions, for another $4.3 billion. Now, $7.7 billion is a lot of money to spend in a short amount of time on two companies that hardly make any profit. But it’s all for the cloud, which means it’s for the future business opportunity in cloud computing services. So far, so good; SAP has acquired quite a number of cloud companies over the past few years: Frictionless, Clear Standards, Crossgate, etc. What makes this most recent acquisition different is the big overlap with existing solutions and internal R&D.
Following the first wave of cloud acquisitions, SAP was sitting amid a zoo of cloud solutions, all based on different platforms: ePurchasing, CRM-OnDemand, BI-OnDemand, Carbon Impact, ByDesign, Streamwork, and more. They all used very different technology, resulting in big integration and scale challenges behind the scenes. The market welcomed with open arms SAP’s announcement, 18 months ago, that it would consolidate its cloud strategy on the new NetWeaver platform for both ABAP- and Java-based cloud solutions.
On May 15, 2012, the Infocomm Development Authority (IDA) of Singapore announced that it would award its much-awaited five-year tender for an externally hosted g-cloud infrastructure to SingTel. My colleague Jennifer Belissent and I published a report on g-cloud opportunities in Asia Pacific late last year that highlighted Singapore as one of the governments leading the way toward g-cloud adoption in the region.
Some key highlights from the Singapore g-cloud contract:
SingTel will be responsible for all of the capex- and opex-related costs needed to build and manage the central infrastructure from its own data center in Singapore.
SingTel will provide a central “G-Cloud Service Portal” through which all government organizations and departments can access central g-cloud services (computing, storage, database, archiving, networking, and other basic resources), and it will derive revenue based on a subscription model.
The Singapore government has not committed to any particular minimum g-cloud usage level.
SingTel will provide the required training to government departments on g-cloud functioning.
SaaS vendors must collect customer insights for innovation and compliance.
As of the end of last year, about 30% of companies from our Forrsights Software Survey, Q4 2011, were using some software-as-a-service (SaaS) solution; that number will grow to 45% by the end of 2012 and 60% by the end of 2013. The public cloud market for SaaS is the biggest and fastest-growing of all of the cloud markets ($33 billion in 2012, growing to $78 billion by the end of 2015).
However, most of this growth is based on the cannibalization of the on-premises software market; software companies need to build their cloud strategy or risk getting stuck in the much slower-growing traditional application market and falling behind the competition. This is no easy task: Implementing a cloud strategy involves a lot of changes for a software company in terms of products, processes, and people.
A successful SaaS strategy requires an open architecture (note: multitenancy is not a prerequisite for a SaaS solution from a definition point of view but is highly recommended for vendors for better scale) and a flexible business model that includes the appropriate sales incentive structure that will bring the momentum to the street. For the purposes of this post, I’d like to highlight the challenge that software vendors need to solve for sustainable growth in the SaaS market: maintaining and increasing customer insights.
In the latest evolution of its Linux push, IBM has added to its non-x86 Linux server line with the introduction of new dedicated Power 7 rack and blade servers that only run Linux. “Hah!” you say. “Power already runs Linux, and quite well according to IBM.” This is indeed true, but when you look at the price/performance of Linux on standard Power, the picture is not quite as advantageous, with the higher cost of Power servers compared to x86 servers offsetting much if not all of the performance advantage.
Enter the new Flex System p24L (Linux) Compute Node blade for the new PureFlex system and the IBM PowerLinux™ 7R2 rack server. Both are dedicated Linux-only systems with two Power 7 processors (six or eight cores, four threads per core) and ship with unlimited licenses for IBM’s PowerVM hypervisor. Most importantly, in exchange for the limitation that they will run only Linux, these systems are priced competitively with similarly configured x86 systems from major competitors, and IBM is betting on the performance improvement, shown by IBM-supplied benchmarks, to overcome any resistance to running Linux on a non-x86 system. Note that this is a different proposition than Linux running on an IFL in a zSeries: The mainframe is usually not the entry point for the customer, since IBM typically sells it to customers with existing mainframes, whereas with PowerLinux it will be attempting to sell to net-new customers as well as established accounts.
Over the last couple of years, IBM, despite having a rich internal technology ecosystem and a number of competitive blade and CI offerings, has not had a comprehensive integrated offering to challenge HP’s CloudSystem Matrix and Cisco’s UCS. This past week, IBM effectively silenced its critics and jumped to the head of the CI queue with the announcement of two products, PureFlex and PureApplication, the results of a massive multiyear engineering investment in blade hardware, systems management, networking, and storage integration. Based on a new modular blade architecture and a new management architecture, the two products are really points on a continuum defined by the level of included software rather than two separate technology offerings.
PureFlex is the base product, consisting of the new hardware (which, despite having the same number of blades as the existing HS blade products, is in fact totally new), which integrates BNT-based networking as well as a new object-based management architecture that can manage up to four chassis and provides a powerful set of optimization, installation, and self-diagnostic functions for the hardware and software stack, up to and including the OS images and VMs. In addition, IBM appears to have integrated the complete suite of Open Fabric Manager and Virtual Fabric for remapping MAC/WWN UIDs and managing VM networking connections, plus storage integration via the embedded V7000 storage unit, which serves as both a storage pool and an aggregation point for virtualizing external storage. The laundry list of features and functions is too long to itemize here, but PureFlex, especially with its hypervisor neutrality and IBM’s Cloud FastStart option, is a complete platform for an enterprise private cloud or a horizontal VM compute farm, however you choose to label a shared VM utility.
Last week I attended Telefónica’s leadership event, which is held annually in Miami, reflecting its very strong base in the Americas. This year’s event attracted around 700 visitors from 130 countries, comprising Telefónica’s customers, vendor partners, and analysts. Several external keynote speakers, such as the CIO of the US government, futurologist Michio Kaku, and the chief economist of the Economist Intelligence Unit, outlined the macro context for society and the economy over the coming 10 to 20 years. Presentations by partners like Huawei, Microsoft, Nokia, amdocs, and Samsung highlighted visions of the future from a vendor angle. Telefónica itself used the opportunity to present its own vision of how technological progress will affect society and business — and how it intends to address the opportunities and challenges ahead.
Telefónica stands out from its peer group of incumbent telcos by having revamped its overall organizational structure. The firm announced this new structure last fall; it effectively sets up one division focused on global internal administration and procurement (Global Resources), one division focused on emerging Internet-based solutions (Digital), and two geographically focused go-to-market business lines (Americas and Europe). Telefónica Multinational Solutions is part of Global Resources and is the division dedicated to delivering services to the MNC segment.