I’m conducting research for our sourcing clients about the right approach to take in IBM software audits. In combing through Forrester’s IBM software sourcing inquiries, I’ve found that most sourcing professionals don't realize they have a licensing problem until it's too late — and they’re struggling to get quick information and guidance.
Some of the problems are vendor-driven: Market expectations keep the IBM Software Group on a high growth trajectory, and license audits contribute to those objectives. Others are client-driven: IBM's clients should expect audits that reconcile license discrepancies, yet they struggle to marshal the resources needed to remain compliant. Regardless of the underlying reasons, these audits often end up in the same place: clients who are drowning in contract administration and compliance costs and frustrated with their IBM relationship.
As part of my research on the auditing process, I’ve been interviewing some former IBM sales reps, and I’m seeing a few trends. Some of my preliminary findings indicate:
Sales teams often don’t have control over who’s audited . . . We spoke with one former IBM sales rep who noted that sales reps don’t have much control over auditing activity. He told us the audit department creates an audit letter and a spreadsheet of clients, pushes the list to the sales team, and asks them to track down the audit targets. This rep indicated that the auditing team asks sales to keep calling the targeted executive until he or she agrees to the audit, at which point the engagement is handed back to the audit team. So your rep may not be deeply connected to the process behind your audit.
A project I’m working on for an approximately half-billion-dollar company in the health care industry has forced me to revisit Hyper-V versus VMware after a long period of inattention on my part, and it has become apparent that Hyper-V has made significant progress as a viable platform for at least medium enterprises. My key takeaways include:
Hyper-V has come a long way and is now a viable competitor in Microsoft environments up through the midsize enterprise, as long as DR/HA requirements are not too stringent and as long as organizations are willing to use Microsoft’s System Center, Server Management Suite, and Performance and Resource Optimization, along with other vendor-specific software, as part of their management environment.
Hyper-V still has limitations in VM memory size, total physical system memory size, and number of cores per VM compared to VMware, and VMware boasts more flexible memory management and I/O options, but these differences are less significant than they were two years ago.
For large enterprises and for complete integrated management, particularly storage, HA, DR and automated workload migration, and for what appears to be close to 100% coverage of workload sizes, VMware is still king of the barnyard. VMware also boasts an incredibly rich partner ecosystem.
For cloud, Microsoft has a plausible story but it is completely wrapped around Azure.
While I have not had the time (or, if I’m being totally honest, the inclination) to develop a very granular comparison, VMware’s recent changes to its legacy licensing structure (and subsequent changes to the new pricing structure) do suggest that license cost remains an attraction for Microsoft Hyper-V, especially if the enterprise is already using Windows Server Enterprise Edition.
I recently had an opportunity to spend some time with SUSE management, including President and General Manager Nils Brauckmann, and came away with what I think is a reasonably clear picture of The Attachmate Group’s (TAG) intentions and of SUSE’s overall condition these days. Overall, impressions were positive, with some key takeaways:
TAG has clarified its intentions regarding SUSE. TAG has organized its computer holdings as four independent business units (Novell, NetIQ, Attachmate, and SUSE), each with its own independent sales, development, marketing, and other resources. The advantages and disadvantages of this approach are pretty straightforward: The business units lose the opportunity to share R&D and marketing/sales resources, but in return they gain crystal-clear accountability and the focus that comes with it. SUSE management agrees that it has undercommunicated in the past and says that, now that the corporate structure has been nailed down, it will be very aggressive in communicating its new structure and goals.
SUSE’s market presence has shifted to a more balanced posture. Over the last several years SUSE has become somewhat less European-centric: It now draws roughly 50% of revenues from North America and less than 50% from EMEA, and it claims to be the No. 1 Linux vendor in China, where it has expanded its development staffing. SUSE also claims to have gained market share overall, laying claim to approximately 30% of worldwide Linux market share by revenue.
Focus on enterprise and cloud. Given its modest revenues of under $200 million, SUSE realizes that it cannot be all things to all people, and it states that it will focus heavily on enterprise business servers and cloud technology, with less emphasis on desktops and on projects that lack strong financial returns, such as its investment in Mono, whose continued development it has handed off to a partnership with Xamarin.
You may have heard the term “business architect” in your travels; if you haven’t, you soon will. This summer, I have watched, and sometimes been involved in, several emotional debates among enterprise and information architects, business analysts, quality managers, Lean Six Sigma experts, management consultants, and IT consultants about the future and origins of their jobs, the skills they need, and, most importantly, their career paths to becoming a business architect.
There’s little doubt that these discussions are critically important to these individuals. Just as interesting from a research perspective is this question: What business problem do business architects need to resolve?
I have recently worked on two research projects addressing this question. For the first one, performed jointly with principal analyst John R. Rymer, our motivation came from a consulting case: Our client had experienced significant extra costs and process instabilities in operations and asked us for advice when a business transformation initiative supported by innovative technologies got out of control.
For the second research project, principal analyst Derek Miers and I surveyed more than 300 business process professionals on their goals, priorities, and the maturity of their business process change programs. Using the collected data, we correlated the maturity assessments with the availability of business architecture functions.
The financial news from the US and Europe – the messy resolution of the US debt ceiling impasse and the related downgrade of US government securities, the sharply higher interest rates on Spanish and Italian debt after the inadequate response to the latest Greek debt crisis, and the big drops in stock markets on Monday – will certainly weaken the economic growth prospects of both the US and Europe. We anticipated much of this two weeks ago, both before the US debt ceiling was raised at the 11th hour along with a makeshift deficit reduction plan (see my blog on July 28, 2011) and after news of much weaker US economic growth came out on Friday (see my blog on July 29, 2011). In fact, the resolution to the debt ceiling issue was slightly better than we expected (no default, and an interim deficit reduction plan that cut only $21 billion in fiscal year 2012, starting in October 2011), while the US economic outlook in Q2 2011 and earlier was quite a bit worse.

The big surprise was S&P's downgrade of US securities from AAA to AA+. While that downgrade was not copied by the other rating agencies and in fact had no impact today on the prices of US Treasury securities, it had a big psychological impact. Along with the bad news coming out of Europe after interest rates on Spanish and Italian debt spiked, the S&P downgrade triggered the roughly 600-point drop in the Dow Jones Industrial Average today, following a 500-point fall on Friday. At best, the result of all these events will be very weak growth in both the US and Europe for the rest of 2011 and well into 2012; at worst, it increases the risk of a renewed recession.
Josh Bernoff, one of Forrester’s leading analysts, spotlights in a new report that we have now entered the age of the customer. Empowered customers are disrupting every industry; competitive barriers like manufacturing strength, distribution power, and information mastery can’t save you. In this age of the customer, the only sustainable competitive advantage is knowledge of and engagement with customers. The successful companies will be customer-obsessed, like Best Buy, IBM, and Amazon.com. Executives in customer-obsessed companies must pull budget dollars from areas that traditionally created dominance — brand advertising, distribution lockup, mergers for scale, and supplier relationships — and invest in four priority areas: 1) real-time customer intelligence; 2) customer experience and customer service; 3) sales channels that deliver customer intelligence; and 4) useful content and interactive marketing. Those that master the customer data flow and improve frontline customer staff will have the edge.
We often hear of city comparisons. In my many years in Russia, I must have heard that St. Petersburg was the Venice of the North hundreds of times. Another is Paris. How many times have you heard “[Insert city] is the Paris of the [insert region]”? Actually, a quick search reveals that there are at least 11 cities that are “the Paris of the East.” Some are quite surprising:
Data management and BI professionals often feel pressure from senior management to propose and start implementing master data management (MDM), data quality, data warehousing, business intelligence (BI), analytics, or other data management strategies quickly, without time to perform the necessary due diligence. These “fire drill” strategy sessions may arise as a reaction to a compelling event like a compliance or regulatory action, the need to support better management planning and decision-making during economic struggles, or the arrival of a new senior executive (e.g., CEO, CIO, CFO, COO, CMO) looking to make his or her mark on the organization by driving such a strategy.
Unfortunately, the program drivers on the hook to deliver these catch-up strategy planning initiatives tend to disregard many best practices in the process. Can you blame them? Many of them are the organizational evangelists who have fought for months – or even years – to get sponsorship and investment to deliver these solutions. When that support finally arrives, they’d be crazy to turn it away just because the timelines are a bit aggressive, right? Well, yes. They should push back if the solution they’re building will not:
Deliver a clear ROI and clear business value, with a line of sight to how the capabilities will improve efficiency, reduce cost, reduce risk, increase revenue, or strategically differentiate your organization. Think that executive sponsor will have your back if you can’t prove the value? Think again.
Scale and offer the flexibility and agility to support the next set of incremental requirements or users that will inevitably come along.
Guarantee end-user adoption and acceptance of a new solution that will likely introduce new processes, technologies, and/or organizational changes.
NVIDIA recently shared a case study involving risk calculations at JP Morgan Chase that I think is significant both for the extreme acceleration gained by integrating GPUs with conventional CPUs and as an illustration of a mainstream financial application of GPU technology.
JP Morgan Chase’s Equity Derivatives Group began evaluating GPUs as computational accelerators in 2009, and now runs over half of their risk calculations on hybrid systems containing x86 CPUs and NVIDIA Tesla GPUs, and claims a 40x improvement in calculation times combined with a 75% cost savings. The cost savings appear to be derived from a combination of lower capital costs to deliver an equivalent throughput of calculations along with improved energy efficiency per calculation.
Implicit in the 40x speedup, from multiple hours to several minutes, is that these calculations can become part of a near-real-time, business-critical analysis process instead of an overnight or daily batch process. Given the intensely competitive nature of derivatives trading, it is highly likely that JPMC will expand its use of GPUs as traders demand an ever-increasing number of these calculations. And of course, the competition has been using the same technology as well, based on numerous conversations I have had with Wall Street infrastructure architects over the past year.
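The batch-window arithmetic is easy to sanity-check. A minimal sketch, assuming a hypothetical 4-hour CPU baseline (the case study cites only "40x" and "multiple hours to several minutes", not an exact starting point):

```python
# Back-of-envelope check of the batch-window math above.
# The 4-hour CPU baseline is an assumption for illustration;
# the 40x factor is the acceleration NVIDIA/JPMC reported.
cpu_minutes = 4 * 60            # assumed baseline: 4-hour overnight batch
speedup = 40                    # reported acceleration factor
gpu_minutes = cpu_minutes / speedup

print(f"{cpu_minutes} min on CPUs -> {gpu_minutes:.0f} min on CPU+GPU hybrid")
# -> 240 min on CPUs -> 6 min on CPU+GPU hybrid
```

At roughly six minutes per run, the calculation fits inside a trading day rather than an overnight window, which is the substance of the "near real-time" claim.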
My net take on this is that we will see a succession of similar announcements as GPUs become a fully mainstream acceleration technology as opposed to an experimental fringe. If you are an I&O professional whose users are demanding extreme computational performance on a constrained space, power and capital budget, you owe it to yourself and your company to evaluate the newest accelerator technology. Your competitors are almost certainly doing so.
I just bought an Apple MacBook Air. As a Windows power user, I worried some about transitioning from my Windows environment to this newfangled, alien-looking device. Happily, it has been a no-brainer. Although I haven't figured out everything, I've embraced the Mac environment relatively easily, despite the fact that Microsoft has entwined itself in my DNA over the past two decades. I'm very happy with my new friend.
Finding the right MacBook case, however, is a different story. I don't know if I want a neoprene zipper bag or an over-the-shoulder messenger bag or the case that disguises your MacBook as a hardback book to confuse potential thieves. It must be light, and it must be fabulous. After hitting more than ten sites with many options but no way to filter for my needs, I think I'll make my own.