As I close out my client inquiry records for the quarter, it’s interesting to review some of the common challenges risk management professionals are currently facing. I was impressed to see how closely the issues I deal with were covered in this month’s edition of Risk Management Magazine. In an article entitled “10 Common ERM Challenges,” KPMG’s Jim Negus called out the following issues:
Assessing ERM’s value
Privilege (of access to risk information)
(Selecting a) risk assessment method
Qualitative versus quantitative (assessment metrics)
Time horizon (for risk assessments)
Multiple possible scenarios
Simulations and stress tests
Negus provides good perspective on these challenges as well as some ideas for solutions. The list is fairly comprehensive, but there are several other challenges that I would have included based on the inquiries I get. First and foremost, the role of technology in risk management – whether for assessments, aggregation, or analytics – comes up very frequently, and vendor selection initiatives have been plentiful since mid-Q4 of last year.
Defining risk management’s role within the business (and vice versa) is also an extremely common topic of conversation. As rules and standards keep changing, this will remain a top challenge. Other frequent issues include event/loss management, building a risk taxonomy, and evaluating vendor/partner risk.
For those of you unable to attend, I will summarize some of the content that I presented on SAP’s overall growth and innovation strategy. SAP has a double-barreled product strategy focused on Growth and Innovation.
The Growth strategy rests heavily on the current Business Suite, which includes the core ERP product that is used by approximately 30,000 companies worldwide. SAP claims that it touches 60 percent of the world’s business transactions, which is hard to validate but not all that hard to believe. The main revenue source today is Support, which comprises 50% of the total revenues of the company at more than 5 billion Euros annually, and it grew by 15% in 2009. Other growth engines include:
Over the weekend I took my daughter to see "Alice in Wonderland" and couldn't resist comparing Johnny Depp's Mad Hatter character to Pega's recent move to acquire Chordiant. For those of you who haven't seen the movie, it's not as weird as the usual Tim Burton movie; but the Mad Hatter character is a little disturbing, with his rhymes and riddles that keep you guessing at his true meaning.
For many process professionals, Pega's recent move was just as confusing as having a conversation with the Mad Hatter. What exactly is he trying to say anyway?
A number of clients ask me, "How many people do you think use BI?" It's not an easy question to answer; the estimate won't be an exact science, and it comes with many caveats. But here we go:
First, let's assume that we are only talking about what we all consider "traditional BI" apps. Let's exclude home-grown apps built using spreadsheets and desktop databases. Let's also exclude operational reporting apps that are embedded in ERP, CRM, and other applications.
Then, let's cut out everyone who only gets the results of a BI report/analysis in a static form, such as a hardcopy or a non-interactive PDF file. So if you're not creating, modifying, viewing via a portal, sorting, filtering, ranking, drilling, etc., you probably do not require a BI product license, and I am not counting you.
I'll just attempt to do this for the US for now. If the approach works, we'll try it for other major regions and countries.
The number of businesses with over 100 employees (a reasonable cutoff for a business size that would consider using what we define as traditional BI) in the US in 2004 was 107,119.
The US Dept of Labor provides ranges, as in "firms with 500-749 employees". For each range I take the midpoint. For the last range, "firms with over 10,000 employees," I use an average of 15,000 employees.
This gives us 66 million (66,595,553) workers employed by US firms who could potentially use BI.
Next, we take the data from our latest BDS numbers on BI, which tell us that 54% of firms are using BI. That gives us 35 million (35,961,598) workers employed by US firms that use BI.
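The arithmetic above can be sketched in a few lines of Python. The firm-size buckets below are hypothetical stand-ins for the actual US Dept of Labor distribution; only the final 54% adoption step uses the real totals quoted above.

```python
def estimate_workers(buckets):
    """Sum employees across firm-size buckets using range midpoints."""
    return sum(firms * midpoint for firms, midpoint in buckets)

# Hypothetical buckets: (number of firms, midpoint employees per firm).
# For an open-ended top range like "over 10,000 employees", pick a
# representative average (the text uses 15,000).
buckets = [
    (90_000, 300),    # e.g. firms with 100-499 employees
    (12_000, 1_500),  # e.g. firms with 500-2,499 employees
    (5_000, 6_000),   # e.g. firms with 2,500-9,999 employees
    (119, 15_000),    # firms with over 10,000 employees
]

potential = estimate_workers(buckets)

# Applying the BDS adoption rate to the actual total from the text:
bi_users = int(66_595_553 * 0.54)
print(bi_users)  # prints 35961598, the 35 million figure quoted above
```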
Inside the BPA Group at Forrester, we conducted a little experiment. I suggested that we should collaborate on a piece about the Pega acquisition of Chordiant. What followed was a large number of email exchanges. I drew the short straw in bringing all these thoughts together into a coherent whole. I prepared a document for Forrester clients to explore the acquisition in detail (probably getting through the editing process next week some time), and this blog post is culled from that document. So while the blog post bears my name, it reflects the collective opinions of Connie Moore, Bill Band, Natalie Petouhoff, John Rymer, Clay Richardson, Craig Le Claire and James Kobielus. Of course, I have put my own interpretation on it too.
Pega definitely wants to be in the customer experience/customer service business, and they want to get there by having a very strong BPM offering. It is not that they are moving away from BPM in favor of customer experience; they’re just strengthening their hand in CRM (or CPM, as they would call it), more forcefully making the connection. We already knew this, but the Chordiant deal just reinforced that point (see the related research doc from Bill Band in 2005!). This is not a new direction or a change in direction for Pega; it is a strong move that takes them faster in the direction they were already going.
From a product point of view, Pega is adding to and strengthening its hand; Chordiant’s marketing automation and predictive analytics seem to be of greatest interest. Of course, Pega also values the engineering talent that Chordiant has, and will redirect those people over time to work on integrating these capabilities into the BPM offering. Pega was also interested in the vertical industry and functional expertise that Chordiant had to offer.
CIO job tenure is now averaging 4.6 years, according to the Society for Information Management. That’s up, way up, from the 2-3 year average that we saw just a few years ago. How do you explain the lengthening time in the job? Is it just because CIOs are better at their jobs than CEOs or CFOs, who have higher churn rates? Probably not.
My guess is that the post-dot-com-bust and post-9/11 recession triggered CIOs to hunker down and be a bit more risk-averse. They stayed put for a few years; then, facing the more recent economic slump, they stayed put even longer. They stayed busy doing what they are, unfortunately, known for: helping with enterprise cost cutting. More reactive, more cost-conscious, and less innovative CIOs are less likely to take risks and less likely to be fired for risk-taking.
But I suspect the trend toward longer tenure is rapidly coming to an end. The CIOs I speak with are eagerly waking up to tackle innovation and new investments in 2010. And we’re seeing more and more ex-consultant hot shots, and business execs from elsewhere in the company recently hired on to “fix IT,” join the CIO ranks. More proactive, innovative, and impactful CIOs are more likely to follow ambitious career paths, or (if you’re a glass-half-empty kind of guy or girl) to get fired for risk-taking.
Microsoft announced on Friday that it will stop selling new Select licenses from 1 July 2011. Customers with existing agreements can renew them for another 36 months, as per their agreements, but the replacement Select Plus program is likely to be a better option. Microsoft launched Select Plus on 1 July 2008, and I wrote at the time that it was an improvement on the basic Select structure: Microsoft Simplifies Its Volume Licensing.
However, Microsoft's pricing team struggled to persuade its LARs to promote Select Plus over the more familiar Select agreement, and customer adoption was disappointing. So the decision to drop the older program makes sense for Microsoft, because it will force its channel partners to embrace the new model. And it's no bad thing for buyers: you have one less choice to make, and there's little negative impact.
The biggest advantage of Select Plus for sourcing managers is that they no longer need to submit a three-year spending forecast; this is extremely difficult for central teams buying on behalf of autonomous business units that haven't planned Microsoft technology adoption that far out. Instead, pricing works like an airline loyalty program, based on the current and previous years' actual transactions, as the figure below from my report illustrates. My report explains some more advantages, such as the flexibility to opt tactically for software assurance on individual purchases.
Hopefully you’ve all read SAP’s co-CEO’s open letter to you (http://ceos.blogs-sap.com), and also some of the great responses such as this one: http://bit.ly/b5foPD . With all these open letters flying around, I thought I’d write a slightly different one. Unlike most of my fellow commentators, I’m not going to tell SAP how to run its business. Instead, I’m going to give you, its customers, a suggestion on how you can cut the cost of your SAP environment. You ready? The answer is “buy less stuff from them”.
Actually, it is not as facile as it sounds. Many companies that I speak with automatically favour their incumbent vendors for new projects, while their IT vendor managers complain to me about their negotiation impotence. You won’t be able to get the contractual protection you need, such as limits on CPI maintenance increases, unless you make them a condition of future purchases. Large software companies such as IBM, Oracle, and SAP focus predominantly on license sales. It wasn’t customers’ unhappiness, resulting from the Enterprise Support blunder, that caused SAP to fire its CEO and rethink its approach. It was the fact that you showed that unhappiness by voting with your purchase orders, delaying projects, going to competing vendors, and causing SAP’s license revenue to plummet. When Jim and Bill promise to “accelerate the pace of the innovation we deliver to you”, the word “deliver” is a euphemism for ‘sell’.
The SAP services market is undergoing significant change: provider consolidation, changes in pricing models, new delivery options, and cloud-based deployment. At the same time, firms are entering 2010 with an eye to growth and business strategy enablement, after a significant focus on cost-cutting during the recession. Firms struggle with finding the best services provider for their SAP project and the best delivery, pricing, and deployment models to ensure value, ROI, and success in achieving business goals. Increasingly, firms are also considering cloud and SaaS delivery models.
SAP users wondering about the latest trends in SAP services – from pricing models to multi-sourcing to cloud – are welcome to join us for an interactive session next Thursday, March 25th. Moderated by Forrester’s George Lawrie, Bill Martorelli, Euan Davis, Stefan Ried, and I will lead an interactive discussion around:
- SAP services provider landscape. The market has undergone significant consolidation, with major acquisitions by firms like PwC (BearingPoint), Xerox (ACS), and Dell (Perot), as well as numerous smaller acquisitions. Leading India-based firms have rapidly built their strategy consulting capabilities and now challenge the MNCs in higher-value project work.
- Offshore delivery. Offshore ratios have grown extremely high. Implementation and project work is commonly 60% or more offshore; support and maintenance work surpasses 90%. Firms’ offshore strategy is broadening beyond India into geographies such as Latin America, China, and the Philippines.
- Outsourcing and AMS work. Firms weigh the trade-offs between single-sourcing their project across implementation, AMS, and hosting versus using multiple providers. Firms also struggle with pricing models and SLAs, with many exploring outcome-based pricing models that shift risk to their provider. Outcome-based pricing also provides a potential foundation for innovation and savings beyond labor arbitrage.
Fast Access To Data Is The Primary Purpose Of Caching
Developers have always used data caching to improve application performance. (CPU registers are data caches!) The closer the data is to the application code, the faster the application will run, because you avoid the access latency caused by disk and/or network. Local caching is fastest because you cache the data in the same memory as the code itself. Need to render a drop-down list faster? Read the list from the database once, and then cache it in a Java HashMap. Need to avoid the performance-sapping disk thrashing of repeated SQL calls when rendering a personalized user’s Web page? Cache the user profile and the rendered page fragments in the user session.
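As a minimal sketch of that read-through pattern (shown here in Python, with a plain dict playing the role of the Java HashMap; `load_from_db` is a hypothetical stand-in for the expensive database call):

```python
# Minimal local read-through cache: the first lookup pays the
# database/network cost; later lookups are served from memory.

class LocalCache:
    def __init__(self, loader):
        self._loader = loader      # expensive source (DB, service, ...)
        self._store = {}           # in-process map, like a Java HashMap

    def get(self, key):
        if key not in self._store:             # cache miss
            self._store[key] = self._loader(key)  # hit the slow source once
        return self._store[key]                # cache hit: local memory

calls = []
def load_from_db(key):
    calls.append(key)              # track how often we reach the "disk"
    return f"value-for-{key}"

cache = LocalCache(load_from_db)
cache.get("dropdown-countries")    # miss: loads from the source
cache.get("dropdown-countries")    # hit: served from local memory
print(len(calls))                  # prints 1: the source was consulted once
```

The same trade-off discussed below applies: this dict lives in one process's memory, so it helps only as long as all requests land on that one application server.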
Although local caching is fine for Web applications that run on one or two application servers, it is insufficient if any or all of the following conditions apply:
The data is too big to fit in the application server memory space.
Cached data is updated and shared by users across multiple application servers.
User requests, and therefore user sessions, are not bound to a particular application server.
Failover is required without data loss.
To overcome these scaling challenges, application architects often give up on caching and instead turn to the clustering features provided by relational database management systems (RDBMSes). The problem: this often comes at the expense of performance and can be very costly to scale up. So, how can firms get improved performance along with scale and fault tolerance?
Elastic Caching Platforms Balance Performance With Scalability And Availability