The SAP services market is undergoing significant change: provider consolidation, changes in pricing models, new delivery options, and cloud-based deployment. At the same time, firms are entering 2010 with an eye to growth and business strategy enablement, after a significant focus on cost-cutting during the recession. Firms struggle to find the best services provider for their SAP project and the best delivery, pricing, and deployment models to ensure value, ROI, and success in achieving business goals. Increasingly, firms are also considering cloud and SaaS delivery models.
SAP users wondering about the latest trends in SAP services – from pricing models to multi-sourcing to cloud – are welcome to join us for an interactive session next Thursday March 25th. Moderated by Forrester’s George Lawrie, Bill Martorelli, Euan Davis, Stefan Ried and I will lead an interactive discussion around:
- SAP services provider landscape. The market has undergone significant consolidation, with major acquisitions by firms like PwC (BearingPoint), Xerox (ACS), and Dell (Perot) as well as numerous smaller acquisitions. Leading India-based firms have rapidly built their strategy consulting capabilities and now challenge the MNCs in higher-value project work.
- Offshore delivery. Offshore ratios have grown extremely high. Implementation and project work is commonly 60% or more offshore; support and maintenance work surpasses 90%. Firms’ offshore strategy is broadening beyond India into geographies such as Latin America, China, and the Philippines.
- Outsourcing and AMS work. Firms weigh the trade-offs between single-sourcing their project across implementation, AMS, and hosting versus using multiple providers. Firms also struggle with pricing models and SLAs; many are exploring outcome-based pricing models that shift risk to their provider. Outcome-based pricing also provides a potential foundation for innovation and savings beyond labor arbitrage.
Fast Access To Data Is The Primary Purpose Of Caching
Developers have always used data caching to improve application performance. (CPU registers are data caches!) The closer the data is to the application code, the faster the application will run, because you avoid the access latency caused by disk and/or network. Local caching is fastest because you cache the data in the same memory as the code itself. Need to render a drop-down list faster? Read the list from the database once, and then cache it in a Java HashMap. Need to avoid the performance-sapping disk thrashing of an SQL call to repeatedly render a user’s personalized Web page? Cache the user profile and the rendered page fragments in the user session.
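The drop-down list example above can be sketched as a minimal read-through local cache. This is an illustrative sketch only, not a production pattern; the `loader` supplier stands in for the real database query.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Supplier;

// Minimal sketch of a local read-through cache backed by an in-memory map.
// The loader is invoked only on the first request for a key; later requests
// are served from application-server memory with no database round trip.
public class LocalCache {
    private final Map<String, Object> cache = new ConcurrentHashMap<>();

    @SuppressWarnings("unchecked")
    public <T> T get(String key, Supplier<T> loader) {
        // computeIfAbsent stores the loaded value and returns the cached
        // instance on every subsequent call.
        return (T) cache.computeIfAbsent(key, k -> loader.get());
    }
}
```

A caller would write `cache.get("countries", () -> queryCountriesFromDb())`: the query runs once, and every later render of the drop-down list reads from memory.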
Although local caching is fine for Web applications that run on one or two application servers, it is insufficient if any or all of the following conditions apply:
- The data is too big to fit in the application server memory space.
- Cached data is updated and shared by users across multiple application servers.
- User requests, and therefore user sessions, are not bound to a particular application server.
- Failover is required without data loss.
To overcome these scaling challenges, application architects often give up on caching and instead turn to the clustering features provided by relational database management systems (RDBMSes). The problem: clustering often comes at the expense of performance and can be very costly to scale up. So, how can firms get improved performance along with scale and fault tolerance?
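The core idea behind caching that scales past one server can be sketched simply: partition keys across a pool of cache nodes so the cache can grow beyond a single JVM heap. This is a hypothetical illustration, assuming a fixed node list; real products use consistent hashing so that adding a node relocates only a fraction of the keys, and replicate each partition to survive node failure.

```java
import java.util.List;

// Hypothetical sketch of key partitioning across cache nodes. The node
// names and the simple modulo scheme are illustrative only.
public class PartitionedCache {
    private final List<String> nodes;

    public PartitionedCache(List<String> nodes) {
        this.nodes = nodes;
    }

    // Deterministically pick the node that owns a key, so every
    // application server routes reads and writes for that key the same way.
    public String nodeFor(String key) {
        int bucket = Math.floorMod(key.hashCode(), nodes.size());
        return nodes.get(bucket);
    }
}
```

Because every application server computes the same node for a given key, cached data can be shared and updated across servers without binding a user session to one machine.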
Elastic Caching Platforms Balance Performance With Scalability And Availability
In my recent interviews with IT services providers on the topic of innovation, one of the key findings was the many different ways in which innovation can be categorized. Some companies view innovation as simply an extension of their traditional R&D capabilities, others view their innovation as a way to prove their thought leadership, still others view innovation largely as a strategic marketing imperative. Sometimes, it’s a combination of these factors.
One interview that stood out was with Lem Lasher, the Chief Innovation Officer (and Global Business Services President) at CSC, who described to me a deep and holistic approach to transforming CSC’s innovation capabilities. Three things stood out to me about Lem’s approach:
With Forrester’s new blogging platform in place, I have the opportunity to launch a series of blogs about tech economics. What do I mean by tech economics? To me, tech economics first means how the larger economy and the tech sector interact. I am interested both in how economic conditions impact the demand for technology goods and services and how business and government purchases of these tech goods and services affect the economy as a whole and the industries and firms in the economy. Second, tech economics is about the revenue of tech vendors, both what they are reporting in the present and past and what we expect those revenues will be based on future purchases by their business and government customers.
My published research on the US and global IT market outlook, industry, regional, and country IT purchase trends, big trends like Smart Computing, and the ePurchasing software market (which I also cover) will continue to be my platform for addressing tech economics. However, I want to use this blog to talk about four focused aspects of the tech market: 1) tech data sources; 2) tech industry definitions; 3) tech market developments; and 4) tech market dynamics. Let’s call these the 4Ds of tech economics, and each will have its own strand of comments and observations.
D1: Tech data sources will be of most use to the data geeks like me in tech vendors. These are folks who use my numbers in their own forecasts of the market for their firm and its products. These blogs will talk about the data sources that I use in building my tech market sizing and forecasts, issues and questions about these data sources, and how the data geeks can leverage them. I will share some (but not all!) of our secret sauce for our forecasts, and I hope you will share some of yours so we can all get better.
Self-service analytics is one of my core coverage focus areas. It applies not just to business intelligence (BI) but also to advanced analytics.
When, a few months ago, I uttered the immortal phrase “roll over rocket scientists,” I was referring more specifically to the need for pervasive self-service tools for predictive analytics and data mining (PA/DM). Considering that my recently published Forrester Wave on PA/DM Solutions primarily addressed the traditional requirements of “rocket scientist” experts in statistical analysis, I did not put a huge emphasis on data mining features geared to business analysts, subject matter experts, and other “non-technical” information workers.
As I’ve stated in that blogpost and the follow-on podcast, the core problem with today’s PA/DM offerings is that many of them are power tools, not solutions that have been designed for the mass business market. Vendors such as SAS Institute, IBM/SPSS, KXEN, Oracle, Portrait Software, Angoss, FICO, and TIBCO Spotfire provide data mining specialists with feature-rich algorithm-powered solutions for modeling, scoring, regression, and other core PA/DM functions. Their core, traditional user base consists of statisticians, mathematicians, and other highly educated analytics professionals.
Sikka made two comments that indicate how he's thinking about the NetWeaver portfolio.
1. In response to my question about whether SAP is concerned that Oracle's ownership of Java will put it at a disadvantage, Sikka started by highlighting SAP's work on Java performance, but then noted the availability of good open-source Java software to support the requirements of SAP customers.
Social networks have their foundations in the space-time continuum—you know, the funky coordinate system that Einstein was so keen about.
Social network analysis is all about looking for patterns of “proximity” among people, considered in their cultural capacities as influencers and followers, innovators and imitators, first-movers and late adopters. Down deep, I consider social network analysis an important new branch of decision support systems as a discipline. The core question is: What unique situational chemistry causes various people, individually or collectively, to make various decisions at various places and times?
That’s where space and time enter the social network analysis equation. It’s not enough that I look up to your shining example and take my lead from what you say and do. It’s just as important that we be in the same city, neighborhood, or room. More than that, it’s important that you and I actually cross paths in order for you to actively influence me to buy that latte, or for you to calm me down and thereby stop me from storming out the door and severing my relationship with a retailer who has ignored my complaints one time too many.
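The space-and-time idea above can be expressed as a toy scoring function: influence between two people combines the strength of the social tie with how close in space and time they actually crossed paths. Everything here is illustrative, including the decay constants; it is a sketch of the concept, not any real social network analysis algorithm.

```java
// Hypothetical sketch: influence decays with spatial distance (km) and
// temporal separation (hours). A strong tie at the same place and time
// yields the full tie strength; distance in space or time shrinks it.
public class ProximityInfluence {
    public static double score(double tieStrength, double distanceKm, double hoursApart) {
        double spatial = Math.exp(-distanceKm / 10.0);   // same neighborhood matters
        double temporal = Math.exp(-hoursApart / 24.0);  // same day matters
        return tieStrength * spatial * temporal;
    }
}
```

Under this toy model, an influencer in the same room right now outweighs an equally admired figure across the country, which is exactly the latte-buying point above.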
For the past couple of years, I have worked on the analysis of global banking platform deals at this time of the year. Currently, I’m again working on the results of a global banking platform deals survey, this time for the year 2009. Accenture and CSC did not participate in 2009, and former participants Fiserv and InfrasoftTech continued the absence they began about two years ago. The 2009 survey began with confirmed submissions from a total of 19 banking platform vendors.
We would have been glad to see more participating vendors, in particular some of the more regionally oriented ones. However, US vendor Jack Henry & Associates as well as multiple regional vendors in Eastern Europe, Asia, and South America did not participate. Nevertheless, the survey saw some “newcomers” from the Americas, Europe, and the Middle East, for example, Top Systems in Uruguay, Eri Bancaire in Switzerland, and Path Solutions in Kuwait. Consequently, the survey now covers banking platform vendors in all regions of the world except Africa and Central America.
However, 19 was not the final vendor count: One of the 19 vendors, France-based banking platform vendor Viveo, dropped out of the survey because Temenos acquired it shortly before Viveo provided its data. Another vendor simply told us that it had only done business with existing clients and, with no new-client business to report, saw no sense in participating. While all other participating vendors won business with new clients (whether the rules of the game allowed Forrester to count that business or not), 2009 was not the best of times.
What the business world needs now is a bigger, badder, more powerful social media dashboard for customer relationship management (CRM). It almost goes without saying that TweetDeck just won’t cut it.
Ideally, the social media dashboard would provide a CRM-integrated interface for monitoring what customers are saying about you in Twitter, Facebook, and other communities. It would also allow you to aggregate high-level customer satisfaction metrics; to flag smouldering issues surrounding defective products and poor customer service; to respond inline through these channels; and to escalate issues internally to the appropriate parties. In other words, it would be, per my colleagues Bill Band and Natalie Petouhoff, a true “customer business intelligence (BI)” dashboard.
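One function from the wish list above, flagging smouldering issues, can be sketched as a simple aggregation: count negative mentions per topic and escalate any topic that crosses a threshold. The `Mention` record and the threshold are hypothetical, not a real CRM or social-monitoring API.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical sketch of one dashboard capability: surface topics whose
// negative-mention count has crossed an escalation threshold.
public class SocialDashboard {
    public record Mention(String topic, boolean negative) {}

    public static List<String> topicsToEscalate(List<Mention> mentions, long threshold) {
        // Count negative mentions per topic.
        Map<String, Long> negatives = mentions.stream()
                .filter(Mention::negative)
                .collect(Collectors.groupingBy(Mention::topic, Collectors.counting()));
        // Keep only the topics at or above the threshold, sorted for stable output.
        return negatives.entrySet().stream()
                .filter(e -> e.getValue() >= threshold)
                .map(Map.Entry::getKey)
                .sorted()
                .collect(Collectors.toList());
    }
}
```

A real dashboard would feed this from Twitter and Facebook streams and route the escalated topics to the appropriate internal parties; the aggregation logic itself stays this simple.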
As you develop your company’s social CRM strategy, you must provide social media dashboards to all roles that participate in the customer lifecycle. Whether you’re a brand manager who simply wants to listen in on social networks to track awareness, sentiment, and propensities, or a salesperson who is interested in identifying and qualifying leads, or a customer service rep who wants to interact closely with established customers, a social media dashboard will soon become a core productivity tool.