Software AG announced today a significant change in its executive structure. After acquiring webMethods back in 2007, the second-largest software vendor in Germany acquired IDS Scheer last year, a topic we explored earlier in this report.
If you have followed Software AG over this period, you may have noticed that the way CEO Karl-Heinz Streibich runs a post-merger process can involve dramatic disruptions in the company's executive structure. Dave Mitchell, the former webMethods CEO, left some months after that acquisition. Today, the Chief Product Officer, Dr. Peter Kürpick, surprisingly left the company. Peter had been a member of the executive board since 2005, and, although his contract officially runs until 2013, he is leaving immediately at his own request. He stood for the successful turnaround of Software AG's product strategy and repositioned Software AG from an outmoded mainframe shop into a leading global integration player. The successful merging of Software AG's mainframe and integration know-how with the newer webMethods product stack into one interoperable integration stack was one of Peter's major achievements. Peter also took over responsibility for Software AG's ETS (mainframe) product strategy after the integration business reached solid stability. He would have had the skills and experience to create a consistent technology stack spanning from the mainframe through the webMethods integration layer up to the business architecture tools of IDS Scheer (ARIS).
I’ve just returned from ShoreTel’s Partner Conference in San Diego, and while the weather was uncharacteristically gray, the executives were exceedingly bright. ShoreTel continues to capitalize on its SMB momentum with its “Brilliant Simplicity” tagline, emphasizing the ease of deployment of its solution for IT administrators, users, and buyers alike. ShoreTel executives were exuberant about their five straight quarters of revenue growth and committed to investing heavily in R&D and sales, highlighting current products (including the announcement of ShoreTel 11) and future directions for the company. Here are several significant items that ShoreTel stressed:
ShoreTel for IBM Lotus Foundations. Already available on the market for four months, this self-healing UCC appliance is easy to deploy, configure, and maintain. IBM has had problems developing market momentum with other partners — Mitel, NEC, and Nortel — but ShoreTel CEO John Combs stressed that the value of the solution combined with the strength of ShoreTel's partners would set it apart. I believe this appliance will be a winner.
ShoreTel Virtualization. Ed Basart, chief technology officer, spoke about future ShoreTel deployments having the ability to be centralized or distributed, depending on the customer's unique communication patterns and needs, as the software for server and switch components is ported to run on VMware. I think ShoreTel will do well to capitalize on the market interest in virtualization; that capability will provide a calling card for ShoreTel at potential enterprise accounts as it continues to increase the potential scale of its solution.
What is the opportunity for Microsoft partners (or other VARs, SIs, ISVs, and technologists) in the emerging cloud computing space? Don't think of cloud as a threat but as an opportunity to ratchet up your value to the business by evangelizing and encouraging its transition to the cloud. How? At the recent Microsoft Worldwide Partner Conference, I addressed this issue in an Expo Theater presentation. Missed it? Now you haven't:
NTT is set to buy Dimension Data (DiData) for US$3.2 billion. For decades, customers have lamented their traditional telco service providers’ lack of IT integration depth — today, NTT appears to be putting its money where its customers want it. Following in the footsteps of more focused deals like BT’s acquisition of Wire One or AT&T’s acquisition of VeriSign’s Global Security Consulting Business, the acquisition of Dimension Data signals NTT’s intent to be a superpower in worldwide information and communications technology (ICT) solutions delivery. But, make no mistake, it is still only a small acquisition for NTT — as one of only three telcos in the world with more than US$100 billion in revenues, the US$3.2 billion acquisition price will have only an incremental effect on the firm’s balance sheet.
Other than the right to say NTT owns a highly respected global ICT integrator, what’s in the deal for NTT?
Many cloud computing services in the consumer space are simply free. Even sophisticated platform-as-a-service (PaaS) environments come from most vendors with a free sandbox environment, with charges applying only to productive use. The obvious question I hear from many vendors today is how to monetize platforms and applications in the cloud. The situation for established ISVs of business applications can be even worse: The cloud might significantly cannibalize existing license revenue streams. Thus, transforming existing business models and vendor strategies is anything but easy.
Addressing this challenge, I'd like to point you to a Forrester workshop, “Selling The Cloud,” on 30 September in London.
The workshop will focus on evaluating your “cloud readiness” and consequently help you develop your cloud strategy through the use of a self-assessment tool. This is a great opportunity to learn an effective method for improving the business results of any migration to a cloud-based service. You can actually predict which, if any, of your products will be successful in a cloud deployment.
The workshop will be hosted by Stefan Ried, Senior Analyst at Forrester. In case you’re interested, here’s a Web page with the agenda: View Workshop Details.
You can register right on the site or, if you’d like more information, you can contact an Event Sales Representative at +1 888/343-6786 or email@example.com
You can also simply leave a comment on this blog with any questions about the event’s agenda and value.
We just published a new report entitled "The Evolution Of Cloud Computing Markets". It recaps many of the cloud computing market observations from the last two years and categorizes the business models in a consistent taxonomy. Basically, all current offerings, from pure infrastructure-as-a-service in the upper left of the taxonomy via virtualization tools up to SaaS applications, can be categorized this way. We explain the key characteristics of each business model and give vendors guidance on positioning and communicating their cloud services.
Beyond the preview on this blog, the full document predicts the future market momentum around:
Informatica is one of the traditional leaders in data quality and data integration. More than 4,000 customers globally trust Informatica's software products, driving more than half a billion dollars in revenue. Informatica solves many of the traditional data integration challenges, for example, between custom-developed apps and packaged ERP solutions. As a result, IT operations professionals and enterprise architects are well aware of Informatica’s solutions. However, what has gone under the radar so far is Informatica's cloud computing approach. For about two years now, Informatica has provided www.informaticacloud.com, a cloud-based integration offering, for its customers. Informatica recently announced a new version of this service, and Forrester had the chance to talk to the vendor prior to the launch. The new solution offers an improved service for data quality, B2B data transformations, and a number of continuous improvements. But what really caught my attention is Informatica's well-kept secret of a sophisticated agent technology.
Back-office managers and European customers have ignored the message — until now
I am starting to see signs of important changes in technology and IT organizations. The increased complexity of IT and business services forces the industry down a new path. In this context, there are signs reminiscent of what happened to the mainframe vendors in the late 80s and early 90s, when the transition from proprietary to open systems was usually not very successful. In fact, the major players of today (with the exception of IBM) were small potatoes in the 80s, while the major players of that time are either gone or dying. And some vendors today seem to be following the same recipe for eventual disaster.
What’s happening, in the case of a major change of market direction at a company whose revenue is based on old technology, is what I would call a “sales force failure”: the inability of the sales force to get out of its base of usual customers and compete head to head with new vendors in the new market.
Usually these organizations are technically capable of building up-to-date products, but the sales results often don’t meet expectations. Since the new product created internally does not sell, the company management may be tempted to fix the problem (i.e., satisfy the shareholders in the short term) by cutting the cost center — that is, the engineering organization making this new product. With R&D gone, the marketing group may license another product to replace the one that it killed. Of course, the margins are not the same, but the cost is almost nonexistent. Eventually, this product does not sell either (the sales force is still in the same condition), and, when the old legacy products are finally dead, the company is no more than a value-added reseller.
One of the great revolutions in manufacturing of the past decades is just-in-time inventory management. The basic idea is to provision only what is needed for a certain level of operation and to put in place a number of management functions that will trigger the provisioning of inventory. This is one of the key elements that has allowed manufacturers of goods to contain production costs. We have been trying to adapt the concept to IT for years with little success. But a combination of the latest technologies is finally bringing the concept to a working level. IT operations often faces unpredictable workloads or large variations of workloads during peak periods. Typically, the solution is to over-provision infrastructure capacity and use a number of potential corrective measures: load balancing, traffic shaping, fast reconfiguration and provisioning of servers, etc.
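The inventory analogy above can be made concrete with a small sketch: instead of over-provisioning up front, a management function watches utilization and triggers provisioning (or release) of capacity only when thresholds are crossed. This is a minimal, hypothetical illustration, not any specific vendor's product or API; the `Pool` class and the 80%/30% thresholds are my own assumptions.

```python
# Illustrative "just-in-time" capacity management, analogous to
# inventory reorder triggers: add a server when utilization runs hot,
# release one when it runs cold. Thresholds and the Pool class are
# hypothetical, chosen only to demonstrate the trigger mechanism.

class Pool:
    def __init__(self, servers=2, min_servers=2, max_servers=10):
        self.servers = servers          # currently provisioned servers
        self.min_servers = min_servers  # never scale in below this floor
        self.max_servers = max_servers  # never scale out above this ceiling

    def rebalance(self, utilization):
        """Scale out above 80% utilization, scale in below 30%."""
        if utilization > 0.80 and self.servers < self.max_servers:
            self.servers += 1           # trigger: provision one more server
        elif utilization < 0.30 and self.servers > self.min_servers:
            self.servers -= 1           # trigger: release an idle server
        return self.servers


# A peak period followed by a quiet period: capacity follows the load.
pool = Pool()
for load in [0.85, 0.90, 0.85, 0.25, 0.20]:
    pool.rebalance(load)
print(pool.servers)  # 3: scaled out to 5 during the peak, back in afterwards
```

Real implementations replace the thresholds with richer policies (load balancing weights, traffic shaping, warm-up times for newly provisioned servers), but the core idea is the same: provisioning is driven by demand signals rather than fixed up-front capacity.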
While it may have taken humans thousands of years to progress from oral to written to audio and then to video communications, in the past five years, the Internet has accelerated at a breakneck pace through all of these different communication transmission stages. It started as a way to post and communicate text and still pictures, then moved to digital voice and music, and then took a giant step to video delivery, bringing you news, sports, movies, whenever and wherever you wanted to view them. The Internet is now the prime platform for distributing video content, effectively replacing your video store and your cable or broadcast distribution.