SAP customers shouldn't worry about the financial hit. SAP can pay the damages without having to rein in R&D. The pain may even spur it to compete harder with Oracle, both commercially and technologically, which would benefit IT buyers.
Was the award fair? Well, IANAL, so I can't answer that. But here's my question: if the basis of the award was "if you take something from someone and you use it, you have to pay," as one juror put it, does that mean SAP gets to keep the licenses the court is forcing it to pay for?
The $1.3 billion verdict in the Oracle v. SAP case is surprising, given that TomorrowNow, SAP's third-party support subsidiary, was fixing glitches and making compliance updates, not trying to resell the software. The jury based the damage award on the fair market value of the software that was illegally downloaded, rather than on Oracle's lost support revenues.
A news article by Bloomberg provides further insight into the jury’s thinking and the legal process. Quoting juror Joe Bangay, an auto body technician: “If you take something from someone and you use it, you have to pay.” Perhaps SAP should have made its case more in layman’s terms.
SAP is in a very difficult position, in that it faces the same threat of revenue loss from third-party support. It could not convincingly defend its entry into the third-party support business for fear of legitimizing a business model that threatens its own lucrative maintenance revenues just as much as it threatens Oracle's.
What happens to the third-party support business going forward? The size of the award potentially dampens customer interest in moving to third-party support, particularly with another case, Oracle v. Rimini Street, still pending. The SAP case, however, does not invalidate third-party support as a business. Third-party support, if carried out properly, offers an important option for enterprise application customers looking for relief from costly vendor maintenance contracts.
For SAP, the verdict is not only painful, but it prolongs the agony, because SAP is compelled to appeal it. SAP certainly has the financial wherewithal to pay the damages but was hoping to put this embarrassing debacle behind it.
I have been working on a research document, to be published this quarter, on the impact of 8-socket x86 servers based on Intel’s new Xeon 7500 CPU. In a nutshell, these systems have the performance of the best-of-breed RISC/UNIX systems of three years ago, at a substantially better price, and their overall performance improvement trajectory has been steeper than competing technologies for the past decade.
This is probably not shocking news and is not the subject of this current post, although I would encourage you to read it when it is finally published. During the course of researching this document I spent time trying to prove or disprove my thesis that x86 system performance solidly overlapped that of RISC/UNIX with available benchmark results. The process highlighted for me the limitations of using standardized benchmarks for performance comparisons. There are now so many benchmarks available that system vendors are only performing each benchmark on selected subsets of their product lines, if at all. Additionally, most benchmarks suffer from several common flaws:
They are results from high-end configurations, in many cases far larger than typical production deployments, and the results cannot be interpolated to smaller, more realistic configurations.
They are often the result of teams of very smart experts tuning the system configuration and the application and system software parameters for optimal results. For a large benchmark such as SAP or TPC, it is probably reasonable to assume that more than 1,000 variables are involved in the tuning effort. This makes the results very much like EPA mileage figures — the consumer is guaranteed not to exceed these numbers.
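One way to see why high-end results cannot simply be interpolated: scaling is rarely linear, because even a small serial fraction of a workload erodes per-socket efficiency as sockets are added. Here is a minimal sketch using Amdahl's law; the 95% parallel fraction is an illustrative assumption, not a figure from any published benchmark.

```python
def amdahl_speedup(parallel_fraction: float, n: int) -> float:
    """Amdahl's law: speedup of an n-way system over a 1-way system
    when only `parallel_fraction` of the work can run in parallel."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n)

# Illustrative assumption: 95% of the workload parallelizes perfectly.
p = 0.95
for sockets in (1, 2, 4, 8, 64):
    s = amdahl_speedup(p, sockets)
    print(f"{sockets:3d} sockets: speedup {s:5.2f}, "
          f"per-socket efficiency {s / sockets:.0%}")
```

Under this assumption, an 8-socket result divided by eight understates what a 1-socket box would do, and multiplying a small-config result by eight overstates the big box — which is why vendors publish tuned flagship numbers rather than a scaling curve.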
With about 41,000 attendees, 1,800 sessions, and a whopping 63,000-plus slides, Oracle OpenWorld 2010 (September 19-23) in San Francisco was certainly a mega event with more information than one could possibly digest or even collect in a week. While the main takeaway for every attendee depends, of course, on the individual’s area of interest, there was a strong focus this year on hardware due to the Sun Microsystems acquisition. I’m a strong believer in the integration story of “Hardware and Software. Engineered to Work Together.” and really enjoyed the Iron Man 2 theme showcased all around the event; but, because I’m an application guy, much of the story, including the launch of Oracle Exalogic Elastic Cloud, was a bit lost on me. And the fact that Larry Ellison basically repeated the same story in his two keynotes didn’t really resonate with me — until he came to what I was most interested in: Oracle Fusion Applications!
Software AG announced today a significant change in its executive structure. After the acquisition of webMethods back in 2007, the second largest software vendor in Germany acquired IDS Scheer last year, a topic we explored in this report.
If you have followed Software AG over this period, you may have noticed that the way CEO Karl-Heinz Streibich runs a post-merger process can involve dramatic disruptions in the executive structure of the company. Dave Mitchell, the former webMethods CEO, left some months after that acquisition. Today, the Chief Product Officer, Dr. Peter Kürpick, surprisingly left the company. Peter had been a member of the executive board since 2005, and, although his contract officially runs until 2013, he is leaving immediately at his own request. He stood for the successful turnaround of Software AG’s product strategy and repositioned Software AG from an outmoded mainframe shop into a leading global integration player. The successful merging of Software AG’s mainframe and integration know-how with the newer webMethods product stack into one interoperable integration stack was one of Peter’s major achievements. Peter also took over responsibility for Software AG’s ETS (mainframe) product strategy after the integration business reached solid stability. He would have had the skills and experience to create a consistent technology stack spanning from the mainframe through the webMethods integration layer up to the business architecture tools of IDS Scheer (ARIS).
To paraphrase Charles Dickens, Q2 2010 seemed like the best of times and the worst of times for the big software vendors. For Microsoft, it was the best of times; for IBM, it was (comparatively) the worst of times; and for SAP, it was somewhere in between. IBM on July 19, 2010, reported total revenue growth of just 2% in the fiscal quarter ending June 30, 2010, with its software unit also reporting 2% growth (6%, excluding the revenues of its divested product lifecycle management group from Q2 2009). Those growth rates were down from 5% growth for IBM overall in Q1 2010, and 11% for the software group. In comparison, Microsoft on July 22, 2010, reported 22% growth in its revenues, with Windows revenues up 44%, Server and Tools revenues up 14%, and Microsoft Business Division (Office and Dynamics) up 15%. And SAP on July 27, 2010, posted 12% growth in its revenues in euros, 5% growth on a constant currency basis, and 5% growth when its revenues were converted into dollars.
What do these divergent results for revenue growth say about the state of the enterprise software market?
I joined an impressively large crowd at SAP’s World Tour event in Birmingham, UK, last week and was able to spend an hour with Tim Noble, head of SAP’s UK and Ireland business unit, and Chris McLain, who leads SAP’s team focusing on its 150 largest accounts in EMEA. I'm writing an update of my 2007 report "Effective SAP Pricing And Licensing Negotiation" and wanted to know what they thought about the clash between traditional deal-based sales incentives and Forrester’s clients’ need for commercial flexibility and more recognition, by their key software providers, of the wider relationship. It’s a topic I’ve raised before (http://blogs.forrester.com/duncan_jones/10-03-19-open_letter_season_sap), and I was very pleased to hear about some things that SAP is doing to reduce this conflict.
I explained why, from my research, software vendors’ insatiable craving for recognizable license revenue, at the expense of creating shared incentives for success, is damaging to customers and to the vendor. Both Tim and Chris clearly understand the problem. Tim keeps reps on the same accounts for several years and rewards them for metrics such as customer satisfaction to avoid the revolving-door sell-and-run approach that characterized software selling before the advent of SaaS. Chris has a team of Global Account Directors who work with local sales, pre-sales, and delivery teams to provide the holistic view that Forrester clients want and struggle to get from SAP’s competitors.
We just published a new report entitled "The Evolution Of Cloud Computing Markets". It recaps many of the cloud computing market observations from the last two years and categorizes the business models in a consistent taxonomy. Basically all current offerings, from pure infrastructure-as-a-service via virtualization tools up to SaaS applications, can be placed in this taxonomy. We explain the key characteristics of each business model and give vendors guidance on positioning and communicating their cloud services.
Beyond the preview on this blog, the full document predicts the future market momentum around:
Informatica is one of the traditional leaders when it comes to data quality and data integration. More than 4,000 customers globally trust Informatica's software products, driving more than half a billion dollars in revenue. Informatica solves many of the traditional data integration challenges, for example, between custom-developed apps and packaged ERP solutions. As a result, IT operations professionals and enterprise architects are well aware of Informatica’s solutions. However, what has gone under the radar so far is Informatica's cloud computing approach. For about two years now, Informatica has provided www.informaticacloud.com, a cloud-based integration offering, for customers. Informatica recently announced a new version of this service, and Forrester had the chance to talk to the vendor prior to the launch. The new solution offers an improved service for data quality, B2B data transformations, and a number of incremental improvements. But what really caught my attention is Informatica's well-kept secret: a sophisticated agent technology.
Back-office managers and European customers have ignored the message — until now
A few days ago at Forrester’s IT Forum in Lisbon (June 9-11), I gave a presentation together with my colleague Andy Bartels on the IT market recovery (we predict 9.3% IT market growth in 2010) after two economically challenging years in 2008/9. In fact, we were making the point that the market rebound we currently see is not simply a recovery but the beginning of a new IT hyper-growth phase fueled by a new wave of innovation.
A strong driver of this innovation is what we at Forrester call Smart Computing: the integration of physical-world information into intelligent, IT-supported business processes in four steps: Awareness (via new sensor technology), Analysis (with advanced BI solutions), Alternatives (including rules and process engines), and Action (in industry business applications), plus a fifth step, a feedback loop of Auditability for tracking and learning.
A well-known example of a smart computing solution is smart metering in the utilities industry. In another presentation in Lisbon, a colleague asked the audience, a room full of representatives of the leading IT service companies, who had a smart metering initiative running – everyone in the room raised a hand. Then he asked who actually had more than one to three (pilot) projects running – and almost no one raised a hand.
Is smart metering just hype that everyone is jumping on, or what is the reality of this lighthouse example of smart computing at this point in time?