At the end of 2012, Forrester and the ITAM Review, an IT asset management community site, ran a software asset management (SAM) survey to help understand where SAM is going in 2013. The resulting infographic and commentary are available to Forrester clients here. For non-clients (hopefully future clients), I've extracted some of the content into this blog post.
The focus and drivers for SAM have changed
Since the early 2000s, risk-focused IT professionals have voiced their concern over software compliance and the potential for vendor audits, large financial fines, damage to corporate reputation, and even the imprisonment of company directors. But these concerns weren't necessarily shared by the rest of the organization, which also viewed the SAM technology available as too difficult and complex to justify. As a result, SAM was a low priority on the IT management to-do list.
But this is starting to change as IT organizations realize that their software estates and their procurement and provisioning processes are in a state of under-management, if not mismanagement. As a result, these organizations are wasting a significant amount of their IT funding each year: buying licenses they don't need, paying maintenance on more licenses than they actually use, and supporting and hosting software that should have been decommissioned.
A little while back Martin Thompson at the ITSM Review wrote an interesting blog on the complexity of IT service management (ITSM) tool pricing: http://www.theitsmreview.com/2011/09/ouch-o-meter/. I particularly liked his term "ouch-o-meter." It's well worth a read.
It's something that has continued to puzzle me: what would it cost to buy AND implement an ITSM tool, PLUS any process or people-based change via professional services or third-party consultancy? Oops, I nearly forgot support and maintenance there too. To make matters worse, this is potentially an unknown and unbudgeted-for cost that appears every 5-7 years due to tool churn, if we analyst types are to be believed (I have an outstanding action to include ITSM tool churn-related questions in a survey). But we need to park the churn issue for now and focus on cost or, more specifically, pricing models.
What did an ITSM tool cost in 2008? Or how long is a piece of string?
I cast my mind back to when I started as an industry analyst in 2008 and the complexity of not only which tools/applications, modules, or features needed to be costed in but also the 30-50% "surcharge" for professional services and 20-22% for support and maintenance. Then of course we needed to apply volume-based discounts, and maybe something else based on "customer-logo-appeal," the customer's sourcing and vendor management strength/capabilities, and/or the salesperson's need to hit quota at that point in time. I've probably oversimplified this too; feel free to educate me.
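To see how those surcharges compound, here's a minimal sketch of the kind of back-of-envelope total-cost model described above. The function name, the default rates, and the $100,000 list price are all hypothetical; only the 30-50% professional services and 20-22% support-and-maintenance ranges come from the text.

```python
# Illustrative only: rough multi-year cost model for an ITSM tool purchase,
# using the ranges mentioned above. All specific figures are hypothetical.

def itsm_total_cost(license_cost, years=5,
                    services_rate=0.40,     # professional services "surcharge" (30-50% range)
                    maintenance_rate=0.21,  # annual support & maintenance (20-22% range)
                    volume_discount=0.10):  # assumed negotiated discount on licenses
    """Estimate the multi-year cost of an ITSM tool (hypothetical model)."""
    licenses = license_cost * (1 - volume_discount)
    services = licenses * services_rate
    maintenance = licenses * maintenance_rate * years
    return licenses + services + maintenance

# A $100,000 list-price purchase, held for five years:
print(f"${itsm_total_cost(100_000):,.0f}")
```

Even this toy model shows the headline license figure more than doubling over five years once services and maintenance are added, which is why the "ouch-o-meter" resonates.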
I help hundreds of technology buyers each year to understand the impact of technology changes on their software contracts, but I also get questions from software providers about how best to price their products. Some are bringing new products to market and want to know how to maximize revenue, while others are struggling with obsolete metrics such as per processor and want to update their pricing for the modern mobile, cloudy world. The answer is usually to find licensing metrics that make their pricing value-based while balancing simplicity and fairness. The more value a customer gets from your product, the more they should be willing to pay for it. If you make your pricing too simple, then you won't match value sufficiently closely, which will cause you to price yourself out of some deals and leave money on the table in others. If, on the other hand, you try to match value too precisely, you risk making your pricing so complicated that buyers will reject it, and you, completely.
For example, suppose you have a product that will help people do their jobs better, so you decide that charging for each user will be a good approximation for value. The potential problem is that not everyone will use your product in the same way, in terms of depth of functionality and/or frequency of access. Your single per-user price will be unfair to companies with long tails of light, infrequent users, for whom you'll therefore be too expensive. Conversely, your pricing will be unfair to you when the customer base is mostly power users. To make your pricing fairer you could have different prices for different categories of user, but then you risk being criticized for being too complex.
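The trade-off above is easy to see in numbers. This is a hypothetical sketch, with invented prices and user mixes, comparing a flat per-user price against a simple two-tier (power/light) price for two customers with opposite usage profiles.

```python
# Hypothetical figures only: flat vs. two-tier per-user pricing.

def flat_price(users, price_per_user=100):
    """Single per-user price, regardless of usage intensity."""
    return users * price_per_user

def tiered_price(power_users, light_users,
                 power_price=150, light_price=30):
    """Two categories of user, priced differently."""
    return power_users * power_price + light_users * light_price

# Customer A: mostly power users -> the flat price undercharges them.
# Customer B: a long tail of light users -> the flat price overcharges them.
print(flat_price(100), tiered_price(90, 10))   # 10000 13800
print(flat_price(100), tiered_price(10, 90))   # 10000 4200
```

Both customers have 100 users, yet under tiered pricing their bills differ by more than 3x, which is exactly the value-matching the flat price fails to capture, at the cost of a more complex price list.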
Jon Hall, of BMC, published an IT asset management (ITAM)-related blog (Let’s work together to fix ITAM’s image problem) in which he shares not only his insights but also what I would call “BMC IP” – what Jon calls an asset management benchmarking worksheet.
I’ve just returned home from San Francisco where I was attending the Oracle OpenWorld 2011 (#OOW11) event. Overall it's a good event, although, as usual, a bit frustrating. Instead of examples of how customers are using its products to transform their businesses, the Oracle keynotes always descend into technical detail, with too little vision and too many unimpressive product demonstrations and ‘paid programming’ infomercials (if I had wanted to listen to Cisco, Dell, and EMC plugging their products, I’d have gone to their events).
When, a month ago, I accepted Oracle’s invitation to attend #OOW11, I thought I’d be able to escape the oncoming British autumn for some California sunshine and watch some Red Sox playoff games on TV. Well, not only did the Sox’s form plummet in September like a stock market index, but Northern California turned out to be 20° colder than London. But despite that, and the all-day Sunday trip to get to the event, one can’t help being impressed by the attendee buzz and by the logistical achievement, with over 45,000 attendees accommodated around the Bay Area and bussed in and out every day to the conference location. Luckily, Oracle looks after its analyst guests very well, so we were within walking distance at the excellent Intercontinental Hotel.
As soon as you think you understand software companies’ policies on virtualization, a new problem appears that makes you tear your hair out and scratch your now-bald head. This month’s conundrum is whether or not VMware’s ThinApp product breaches your Microsoft Windows license agreement:
However, Microsoft, via its knowledge base, claims that “Running multiple versions of Windows Internet Explorer, or portions of Windows Internet Explorer, on a single instance of Windows is an unlicensed and unsupported solution.” http://support.microsoft.com/kb/2020599/en-us#top
VMware doesn’t warn customers that ThinApp could cause them Microsoft licensing problems, but neither does it claim that it is legal. It merely advises customers to check with Microsoft.
The lines between software and services are blurring, and with the rise of cloud computing that trend has accelerated faster than ever. But customers aren’t just looking at cloud business models, such as software-as-a-service (SaaS), when they want more flexibility in the way they license and use software. While in 2008 upfront perpetual software licenses (capex) made up more than 80% of a company’s software license spending, this percentage will drop to about 70% in 2011. The other 30% will consist of different, more flexible licensing models, including financing, subscription services, dynamic pricing, risk sharing, or used license models.
Forrester is currently digging deeper into the different software licensing models, their current status in the market, as well as their benefits and challenges. We kindly ask companies that are selling software and/or software-related services to participate in our ~20-minute online Forrester Research Software Licensing Survey, letting us know about current and future licensing strategies. Of course, all answers are optional and will be kept strictly confidential. We will only use anonymous, aggregated data in our upcoming research report, and interested participants can get an advance consolidated summary of the survey results if they choose to enter an optional email address in the survey.
As promised in a previous blog post, Which Software Licensing Policy Is The Unfairest Of Them All?, we've launched a survey to find out what sourcing and vendor management professionals think about some common software licensing policies. This isn't about bashing powerful software companies, but about building a consensus behind a campaign to bring software licensing rules up to date — i.e., protection of innocent buyers, rather than regime change. I've narrowed an initial list of 30 questionable policies down to this Foul Fifteen of candidates for the (un)coveted "Unfairest" award:
1. Double charging for external users
2. Prohibiting or overcharging for anonymous users
3. Maintenance on shelfware
4. Counting cores instead of processors
5. Counting all processors in a server, even if partitioned
6. Upfront license purchase only, not phased in line with project milestones
7. Maintenance repricing
8. Insisting on purchase of all licenses before implementation starts
9. Product enhancements packaged as new SKUs
10. Licensing by deployment, even if unused
11. Charging for use of modules that customers cannot control or track
12. Retaining right to change licensing policies at any time
13. Multiplexing – definition is unclear or too wide
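Policies 4 and 5 above are easiest to appreciate with a worked example. The sketch below is hypothetical: the "core factor" multiplier and the server configuration are invented for illustration, not taken from any vendor's actual price list.

```python
# Illustrative arithmetic for counting cores vs. processors (policies 4 & 5).
# The core factor and server sizes here are hypothetical.

def licenses_needed(sockets, cores_per_socket, metric,
                    core_factor=0.5, partition_fraction=1.0):
    """How many licenses a server needs under different counting metrics."""
    total_cores = sockets * cores_per_socket
    if metric == "per_processor":
        return sockets
    if metric == "per_core":
        # count only the cores actually made available to the software
        return total_cores * core_factor * partition_fraction
    if metric == "all_cores_ignore_partitioning":
        # policy 5: every core in the box counts, partitioned or not
        return total_cores * core_factor
    raise ValueError(f"unknown metric: {metric}")

# A 4-socket, 8-cores-per-socket server with half its cores partitioned off:
print(licenses_needed(4, 8, "per_processor"))                     # 4
print(licenses_needed(4, 8, "per_core", partition_fraction=0.5))  # 8.0
print(licenses_needed(4, 8, "all_cores_ignore_partitioning"))     # 16.0
```

Same hardware, same workload, yet the license count quadruples depending purely on the vendor's counting policy, which is why buyers regard these metrics as candidates for the "Unfairest" award.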
Two months ago, we announced our upcoming Forrester Forrsights Software Survey, Q4 2010. Now the data is back from more than 2,400 respondents in North America and Europe and provides us with deep and sometimes surprising insights into the software market dynamics of today and the next 24 months.
We’d like to give you a sneak preview of interesting results around some of the most important trends in the software market: cloud computing, integrated information technology, business intelligence, mobile strategy, and overall software budgets and buying preferences.
Companies Start To Invest More Into Innovation In 2011
After the recent recession, companies are starting to invest more in 2011: 12% of companies plan to increase their software budgets by more than 10%, and a further 22% by between 5% and 10%. At the same time, companies will invest a significant part of the additional budget in new solutions. While 50% of total software budgets still go to software operations and maintenance (Figure 1), this number has dropped significantly from 55% in 2010; spending on new software licenses will accordingly increase from 23% to 26%, and custom-development budgets from 23% to 24%, in 2011.
Cloud Computing Is Getting Serious
In this year’s survey, we have taken a much deeper look into companies’ strategies and plans around cloud computing, beyond simple adoption numbers. We have tested to what extent cloud computing is making its way from complementary services into business-critical processes, replacing core applications and moving sensitive data into public clouds.
Early next year I'm going to ask sourcing and vendor management professionals to vote on which software companies' licensing policies they most resent as unfair. Fairness is a subjective quality, but it seems to me that some policies penalize customers for circumstances beyond their control that are unrelated to the value they are getting from the software. Others have serious consequences that may not have been apparent to the buyer when they agreed to the contract. Fair software pricing charges some companies more than others, but in a logical, transparent way that is related to value. Jim Hagemann Snabe (SAP's co-CEO) explained software pricing best practice extremely well in this recent interview with Computerweekly.com's Warwick Ashford:
"Q: What is SAP doing to meet user demand for greater clarity on licensing and pricing?"