What Is Your IT Strategy To Win In The Age Of The Customer?

Doug Washburn

Consider the following scenario: It’s a hot summer day and a prospective customer walks into your store to buy an air conditioner. He evaluates several models and then buys one — but not from you. It turns out your competitor located two miles away is offering the same model at a 20% discount. How did he know this? He scanned the product's bar code using the RedLaser app on his iPhone, which displayed several local retailers with lower prices than yours. If he had been willing to wait three days for shipping, he could have purchased the exact same model while standing in your store from an online retailer at a 30% discount.

This type of technology-fueled disruption is affecting all industries, not just retailers. Since the early 1900s, businesses have relied on competitive barriers such as manufacturing strength, distribution power, and information mastery. But all of this is changing in the age of the customer, where empowered buyers have the information at their fingertips to check a price, read a product review, or ask a friend for advice right from the screen of a smartphone.

To compete in the age of the customer, your business must become customer-obsessed. As Forrester’s Josh Bernoff (@jbernoff), SVP of Idea Development and author of Groundswell and Empowered, advocates in his latest research: “The only source of competitive advantage is the one that can survive technology-fueled disruption — an obsession with understanding, delighting, connecting with, and serving customers.”

Read more

Question On BI Total Cost Of Ownership

Boris Evelson

I need your help. I am conducting research into business intelligence (BI) software prices: averages, differences between license and subscription deals, differences between small and large vendor offerings, etc. To help our clients look beyond just the software prices and consider the fully loaded total cost of ownership, I also want to factor in service and hardware costs (I already have data on annual maintenance and initial training costs). I’ve been in this market long enough to understand that the only correct answer is “It depends” — on the levels of data complexity, data cleanliness, use cases, and many other factors. But if I could pin you down to a ballpark formula for budgeting and estimation purposes, what would that be? Here are my initial thoughts, based on experience, other relevant research, etc.

  • Initial hardware as a percentage of software cost = 33% to 50%
  • Ongoing hardware maintenance = 20% of the initial hardware cost
  • Initial design, build, and implementation services. Our rule of thumb has always been 300% to 700% of the software cost, but that obviously varies by deal size. So here’s what I came up with:
    • Less than $100,000 in software = 100% in services
    • $100,000 to $500,000 in software = 300% in services
    • $500,000 to $2 million in software = 200% in services
    • $2 million to $10 million in software = 50% in services
    • More than $10 million in software = 25% in services
  • Then 20% of the initial software cost for ongoing maintenance, enhancements, and support

Thoughts? Again, I am not looking for “it depends” answers, but rather for some numbers and ranges based on your experience.
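For readers who want to sanity-check a budget against these ratios, the tiered rules above can be sketched as a small calculator. The tier boundaries and percentages are the ballpark figures proposed above (with the midpoint of the 33%-50% hardware range as a default), not validated benchmarks:

```python
def bi_tco(software_cost, years=3, hw_pct=0.40):
    """Ballpark fully loaded BI TCO using the rough ratios proposed above.

    software_cost: initial software license cost in dollars
    years: number of years of ongoing maintenance to include
    hw_pct: initial hardware as a fraction of software cost (33%-50% range)
    """
    # Tiered services multiplier, keyed off the initial software cost.
    if software_cost < 100_000:
        svc_pct = 1.00   # 100% in services
    elif software_cost < 500_000:
        svc_pct = 3.00   # 300% in services
    elif software_cost < 2_000_000:
        svc_pct = 2.00   # 200% in services
    elif software_cost < 10_000_000:
        svc_pct = 0.50   # 50% in services
    else:
        svc_pct = 0.25   # 25% in services

    hardware = software_cost * hw_pct
    services = software_cost * svc_pct
    annual_hw_maint = hardware * 0.20       # 20% of initial hardware cost
    annual_sw_maint = software_cost * 0.20  # 20% of initial software cost

    return (software_cost + hardware + services
            + years * (annual_hw_maint + annual_sw_maint))
```

For example, a $1 million software deal with 40% hardware over three years comes to $1M + $400K hardware + $2M services + 3 × ($80K + $200K) maintenance, or about $4.24 million — more than four times the license cost, which is exactly why the “look beyond software prices” framing matters.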

Do Not Depend On EA To Innovate

Brian Hopkins

Many organizations expect EAs to be the source of technology innovations. They are broadly knowledgeable, experienced, connect-the-dots people whom you might naturally expect to come up with reasonable ideas for new approaches and technology. But when you think about it a bit, this expectation is misplaced. Here’s why:

The best technology innovators are users who have a problem to solve; motivation to solve a specific problem affecting their lives is the key ingredient. EAs just don’t have these kinds of problems; because they operate as a bridge between business and technology, most often they are attempting to solve things that affect other people’s lives. Please don’t get me wrong: EAs are always looking for new, innovative ways to improve things. But this doesn’t replace the “I gotta fix this now” kind of motivation inspiring most innovations.

So am I saying organizations should take EAs out of the innovator role? Yes and no.

Here at Forrester, we have been writing and talking about topics such as Innovation Networks and new roles for business technology for a while. I think that EAs are better placed at the center of an Innovation Network where they connect innovation suppliers (lead users who are dreaming up new ways to solve their problems) with innovation users (other folks who can benefit from a generalization of the solutions the suppliers come up with). In addition, EAs can bring innovation implementers — the team members who know how to actually make innovations into solutions that work for more than just one individual or group — into the conversation.

So what should you do?

  1. Send EAs on a mission to find people doing innovative things in IT and the business. This has a side effect of connecting EAs to the frontlines, where they might discover all kinds of things.
Read more

Thoughts On Strategic Partnerships From Infosys Leaders And Clients

Duncan Jones

I’m in Las Vegas attending Infosys’s Connect 2011 client event, and one of the recurring themes in sessions and side conversations has been the nature of strategic partnership. The phrase risks becoming a meaningless cliché, so I was interested to find out what it actually means to Infosys execs and clients. I got some interesting, varied perspectives.

A large CPG company’s central IT group described its interpretation in a couple of sessions. It demands, among other things, a strong cultural fit, a commitment to win-win solutions to problems, and regular meetings with partners’ CEOs. This group has 12 “strategic partners” who get a lead role in a specific area but may not even be considered in other areas, even though they have good solutions in their portfolios. I might argue the semantic point of whether this makes them merely “important, at the moment” rather than “strategic.” However, the key point is that the two parties’ commitment to making the partnership work creates a better, stronger commercial framework than any legal agreement could deliver.

Raj Joshi, MD of Infosys Consulting, described his group’s Value Realization Method (VRM), which formally tracks each project’s expected business benefits from the initial project business case through design and implementation and on to ongoing value delivery. Joshi stressed the importance of shared incentives, such as risk/reward sharing commercial models, in ensuring projects’ success.

Read more

Hyper-V Matures As An Enterprise Platform

Richard Fichera

A project I’m working on for an approximately half-billion dollar company in the health care industry has forced me to revisit Hyper-V versus VMware after a long period of inattention on my part, and it has become apparent that Hyper-V has made significant progress as a viable platform for at least medium enterprises. My key takeaways include:

  • Hyper-V has come a long way and is now a viable competitor in Microsoft environments up through the midsize enterprise, as long as DR/HA requirements are not too stringent and the organization is willing to use Microsoft’s System Center, Server Management Suite, and Performance and Resource Optimization, as well as other vendor-specific software, as part of its management environment.
  • Hyper-V still has limitations in maximum VM memory, total physical system memory, and cores per VM compared with VMware, and VMware boasts more flexible memory management and I/O options, but these differences are less significant than they were two years ago.
  • For large enterprises, for completely integrated management (particularly storage, HA, DR, and automated workload migration), and for what appears to be close to 100% coverage of workload sizes, VMware is still king of the barnyard. VMware also boasts an incredibly rich partner ecosystem.
  • For cloud, Microsoft has a plausible story but it is completely wrapped around Azure.
  • While I have not had the time (or, if I’m being totally honest, the inclination) to develop a very granular comparison, VMware’s recent changes to its legacy licensing structure (and subsequent changes to the new pricing structure) suggest that license cost remains an attraction for Microsoft Hyper-V, especially if the enterprise is using Windows Server Enterprise Edition.
Read more

Catching Up With SUSE -- The Attachmate Group Clarifies Branding And Role For SUSE

Richard Fichera

I recently had an opportunity to spend some time with SUSE management, including President and General Manager Nils Brauckmann, and came away with what I think is a reasonably clear picture of The Attachmate Group’s (TAG) intentions and of SUSE’s overall condition these days. Overall, impressions were positive, with some key takeaways:

  • TAG has clarified its intentions regarding SUSE. TAG has organized its computer holdings as four independent business units (Novell, NetIQ, Attachmate, and SUSE), each with its own independent sales, development, and marketing resources. The tradeoffs of this approach are pretty straightforward: the lost opportunity to share R&D and marketing/sales resources among the business units is balanced by crystal-clear accountability and the attendant focus it brings. SUSE management agrees that it has undercommunicated in the past, and says that now that the corporate structure has been nailed down, it will be very aggressive in communicating its new structure and goals.
  • SUSE’s market presence has shifted to a more balanced posture. Over the last several years SUSE has become somewhat less European-centric, with 50% of revenues now coming from North America and less than 50% from EMEA, and it claims to be the No. 1 Linux vendor in China, where it has expanded its development staffing. SUSE claims to have gained market share overall, laying claim to approximately 30% of worldwide Linux market share by revenue.
  • Focus on enterprise and cloud. Given its modest revenues of under $200 million, SUSE realizes that it cannot be all things to all people, and states that it will focus heavily on enterprise business servers and cloud technology, with less emphasis on desktops and on projects that lack strong financial returns, such as Mono, whose continued development it has handed off to partner Xamarin.
Read more

S&P Downgrade Of US Debt And Related Financial Market Distress Mean Slower Growth In US And Global Tech Markets

Andrew Bartels

The financial news from the US and Europe — the messy resolution of the US debt ceiling impasse and the related downgrade of US government securities, the sharply higher yields on Spanish and Italian debt after an inadequate response to the latest Greek debt crisis, and the big drops in stock markets on Monday — will certainly weaken the economic growth prospects of both the US and Europe. We anticipated much of this two weeks ago, both before the US debt ceiling was raised at the 11th hour along with a makeshift deficit reduction plan (see my blog on July 28, 2011) and after the news of much lower US economic growth came out on Friday (see my blog on July 29, 2011). In fact, the resolution of the debt ceiling issue was slightly better than we expected (no default, and an interim deficit reduction that cut only $21 billion in fiscal year 2012, which starts in October 2011), while the US economic outlook in Q2 2011 and earlier was quite a bit worse.

The big surprise was S&P’s downgrade of US securities from AAA to AA+. While that downgrade was not copied by the other rating agencies and in fact had no impact today on the prices of US Treasury securities, it had a big psychological impact. Along with the bad news coming out of Europe after interest rates on Spanish and Italian debt spiked, the S&P downgrade triggered the 600-point or so drop in the Dow Jones Industrial Average today, following a 500-point fall on Friday. At best, the result of all these events will be very weak growth in both the US and Europe for the rest of 2011 and well into 2012; at worst, they increase the risk of a renewed recession.

Read more

Venices (And Singapores) Of The World: Imitation And Innovation

Jennifer Belissent, Ph.D.

We often hear city comparisons. In my many years in Russia, I must have heard hundreds of times that St. Petersburg is the Venice of the North. Another favorite is Paris. How many times have you heard “[insert city] is the Paris of the [insert region]”? Actually, a quick search reveals that there are at least 11 cities billed as “the Paris of the East.” Some are quite surprising:

Read more

GPU Case Study Highlights Financial Application Acceleration

Richard Fichera

NVIDIA recently shared a case study involving risk calculations at JP Morgan Chase that I think is significant both for the extreme acceleration gained by integrating GPUs with conventional CPUs and as an illustration of a mainstream financial application of GPU technology.

JP Morgan Chase’s Equity Derivatives Group began evaluating GPUs as computational accelerators in 2009 and now runs over half of its risk calculations on hybrid systems containing x86 CPUs and NVIDIA Tesla GPUs, claiming a 40x improvement in calculation times combined with a 75% cost savings. The cost savings appear to derive from a combination of lower capital costs to deliver equivalent calculation throughput and improved energy efficiency per calculation.

Implicit in the 40x speedup, from multiple hours to several minutes, is that these calculations can become part of a near-real-time, business-critical analysis process instead of an overnight or daily batch process. Given the intensely competitive nature of derivatives trading, it is highly likely that JPMC will expand its use of GPUs as traders demand an ever-increasing number of these calculations. And of course, its competitors have been using the same technology as well, based on numerous conversations I have had with Wall Street infrastructure architects over the past year.

My net take is that we will see a succession of similar announcements as GPUs become a fully mainstream acceleration technology rather than an experimental fringe. If you are an I&O professional whose users demand extreme computational performance on a constrained space, power, and capital budget, you owe it to yourself and your company to evaluate the newest accelerator technology. Your competitors are almost certainly doing so.

The Business Architect Cometh

John R. Rymer

[Forrester Principal Analyst Alexander Peters, PhD, and I collaborated on this research.]

You may have heard the term "business architect" in your travels; if you haven't, you soon will. A significant number of our clients are searching for these new leaders. Broadly, business architects (BAs) are responsible for developing and managing an organization's business model and business technology (BT) agenda, filling the gap between business management and IT management. One way to bridge that gap is to make doing so someone's explicit responsibility.

In our research, we spoke to individuals occupying this crucial role in a large European agency, a large financial services firm, a regional healthcare provider, a diversified energy provider, and a logistics firm. The need for BAs is most acute in organizations that are in the midst of transforming their businesses and information systems.

The need for business architects is manifest, but what's less apparent to these firms is how this role should be structured, who should occupy it, and what a BA's responsibilities should be (as illustrated by wide variations in job ads for the position). In our research, we found two established models for the BA role:

Read more