From Product To Outcome Engagement

Peter Burris

Ah, the good ol’ days, when technology customers just wanted smaller, faster, and cheaper.  Well, they still want that, but that’s not all they want. They want business outcomes: the differentiated business capabilities that technology makes possible realized with minimized risk.

Today’s business technology buyers are embedding technology deeper into their organizations. They’re using technology to not just record business, but to uniquely mediate customer interactions, stream offerings, and shape market futures.

These differentiated business capabilities are complex, requiring customers to effect a multitude of trade-offs, implementation choices, and organizational changes. The journeys businesses take to achieve differentiated capabilities are uncertain. Outcomes, therefore, often are unknown.

Business technologists have learned the hard way that happy outcomes are not achieved simply by purchasing the right stuff. The real challenge is to successfully transform technology investments into business capabilities, at the least cost, risk, and time.

Ultimately, business technologists have learned that outcomes are co-created by vendors and users.

But most vendors are still set up primarily to sell products. Product portfolios, marketing activities, and sales behaviors still presume that customers are largely passive in the value-creation process, as though the act of buying and the achievement of outcomes were one and the same.

Most vendors simply do not try to sustain engagement across a customer’s entire outcome lifecycle.

Read more

Public Sector Is Hot: IT Services Providers Take Note

Jennifer Belissent, Ph.D.

The public sector is certainly hot these days – definitely in the hot seat, in hot water.  Concerns about public sector finance persist, with the discussion in some cases targeting specific causes beyond just vague notions of overspending.  The Economist recently came down pretty hard on public sector unions.

However, for some tech vendors, the public sector really is hot – as in a hot opportunity. Despite revised earnings and warnings about public sector forecasts from some tech vendors, others remain optimistic. Steria, a French IT services company, is not too concerned about the lingering malaise of the public sector, although it has not been immune to the crisis. A UK public sector spending moratorium in 2010 brought all projects of more than £1 million to a temporary halt, for review. Steria and other suppliers and service providers held their breath through much of the fall.

Read more

Why Product Strategists Should Embrace Conjoint Analysis

JP Gownder

Aside from my work with product strategists, I’m also a quant geek. For much of my career, I’ve written surveys (to study both consumers and businesses) to delve deeply into demand-side behaviors, attitudes, and needs. For my first couple of years at Forrester, I actually spent 100% of my time helping clients with custom research projects that employed data and advanced analytics to help drive their business strategies.

These days, I use those quantitative research tools to help product strategists build winning product strategies. I have two favorite analytical approaches: segmentation analysis, an important tool in its own right, and – my very favorite – conjoint analysis. If you, as a product strategist, don’t currently use conjoint, I’d like you to spend some time learning about it.

Why? Because conjoint analysis should be in every product strategist’s toolkit. Also known as feature tradeoff analysis or discrete choice, conjoint analysis can help you choose the right features for a product, determine which features will drive demand, and model pricing for the product in a very sophisticated way. It’s the gold standard for price elasticity analysis, and it offers extremely actionable advice on product design.  It helps address each of “the four Ps” that inform product strategies.
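To make the idea concrete, here is a minimal sketch of ratings-based conjoint analysis on entirely synthetic data. Respondents rate hypothetical product profiles built from on/off features, and a simple least-squares fit recovers each feature's "part-worth" – its contribution to overall preference. All feature names and numbers below are illustrative assumptions, not results from any real study; commercial conjoint work uses richer designs (choice-based tasks, hierarchical Bayes estimation), but the core trade-off logic is the same.

```python
# Ratings-based conjoint sketch on synthetic data: recover feature
# part-worths from profile ratings via ordinary least squares.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" part-worths the survey is trying to uncover,
# e.g. long battery life, premium color, lower price tier.
true_partworths = np.array([2.0, 0.5, 1.2])
n_profiles = 200

# Each profile toggles the three features on or off (a real study would
# use a fractional-factorial or choice-based design).
profiles = rng.integers(0, 2, size=(n_profiles, 3)).astype(float)

# Simulated respondent ratings: latent utility plus noise.
ratings = profiles @ true_partworths + rng.normal(0, 0.3, n_profiles)

# OLS with an intercept column recovers the part-worths.
X = np.column_stack([np.ones(n_profiles), profiles])
coef, *_ = np.linalg.lstsq(X, ratings, rcond=None)
estimated = coef[1:]

print("estimated part-worths:", np.round(estimated, 2))
```

Once part-worths are estimated, summing them over any candidate feature bundle predicts its relative appeal – which is exactly how conjoint supports feature selection and pricing decisions.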

Read more

ARM-Based Servers – Looming Tsunami Or Just A Ripple In The Industry Pond?

Richard Fichera

What was once nothing more than outlandish speculation – the prospect of a new entrant into the volume Linux and Windows server space – has suddenly become much more concrete, culminating in an immense buzz at CES as numerous players, including NVIDIA and Microsoft, stoked the fires with innuendo, announcements, and demos.

Consumers of x86 servers are always on the lookout for faster, cheaper, and more power-efficient servers. In the event that they can’t get all three, the combination of cheaper and more energy-efficient seems to be attractive to a large enough chunk of the market to have motivated Intel, AMD, and all their system partners to develop low-power chips and servers designed for high density compute and web/cloud environments. Up until now the debate was Intel versus AMD, and low power meant a CPU with four cores and a power dissipation of 35 – 65 Watts.

The Promised Land

The performance trajectory of what were formerly purely mobile-device processors, notably the ARM Cortex line, has suddenly introduced a new potential option into the collective industry mindset. But is this even a reasonable proposition, and if so, what does it take for it to become a reality?

Our first item of business is to figure out whether or not it even makes sense to think about these CPUs as server processors. My quick take is yes, with some caveats. The latest ARM offering is the Cortex A9, with vendors offering dual core products at up to 1.2 GHz currently (the architecture claims scalability to four cores and 2 GHz). It draws approximately 2W, much less than any single core x86 CPU, and a multi-core version should be able to execute any reasonable web workload. Coupled with the promise of embedded GPUs, the notion of a server that consumes much less power than even the lowest power x86 begins to look attractive. But…
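The power arithmetic behind that "But…" is worth making explicit. The back-of-envelope sketch below compares how many ~2 W dual-core Cortex A9 parts versus 35 W low-power x86 parts fit in a fixed rack power budget. The rack budget is a hypothetical figure, and this counts CPU power only – it ignores memory, storage, networking, and per-node overhead, which in practice erode much of the ARM advantage – so treat it as an illustration of the trade-off, not a benchmark.

```python
# Back-of-envelope density comparison under a fixed power budget.
# All figures are illustrative: CPU dissipation only, no system overhead.

RACK_POWER_BUDGET_W = 10_000   # hypothetical per-rack power budget
ARM_CPU_W = 2                  # cited figure for a dual-core Cortex A9
X86_LOW_POWER_W = 35           # low end of the 35-65 W x86 range

arm_cpus_per_rack = RACK_POWER_BUDGET_W // ARM_CPU_W
x86_cpus_per_rack = RACK_POWER_BUDGET_W // X86_LOW_POWER_W
density_ratio = arm_cpus_per_rack / x86_cpus_per_rack

print(f"ARM CPUs per rack: {arm_cpus_per_rack}")
print(f"x86 CPUs per rack: {x86_cpus_per_rack}")
print(f"density ratio:     {density_ratio:.1f}x")
```

The ratio shows why the question hinges on per-core throughput: the x86 part only needs to do roughly that many times more work per chip to break even on this naive measure.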

Read more

Counterintuitive Collaboration Trends 2011: Consumerization Leads The Disrupter List

Ted Schadler

It's important sometimes to step back from the obvious trends and look at things that lie just beyond the light. So in addition to the clear trends in play – mobilizing the entire collaboration toolkit, moving collaboration services to the cloud (often in support of mobile work), and consolidating collaboration workloads onto a full-featured collaboration platform – here are six counterintuitive trends for 2011 (for more detail and an analysis of what content & collaboration professionals should do, please read the full report, available to Forrester clients or by credit card):

  1. Consumerization gets board-level approval. Consumerization is inevitable; your response is not. In 2011, tackle this head on. (And read our book, Empowered, while you're at it -- it has a recipe for business success in the empowered era, a world in which customers and employees have power.)
  2. The email inbox gets even more important. I know the established wisdom is that email will get less relevant as Gen Y tweets its way to business collaboration. But come on – look at all the drivers of email: it aggregates feeds from social media, and it's universal and pervasive on any device. Email's here to stay. But it's time to reinvent the inbox. IBM and Google are leading this charge.
  3. The cloud cements its role as the place for collaboration innovation. The cloud is better for mobile, telework, and distributed organizations. And cloud collaboration services will get better faster than on-premises alternatives. Full stop. The math isn't hard to do. A quarterly product release cycle beats a four-year upgrade cycle every time.
Read more

SAP Reports Q4 2010 Best Software Sales Quarter In History (But Not The Full Year)

Holger Kisker

Yesterday SAP announced its Q4 and full year 2010 revenue results.

It's nice to see that SAP has managed the turnaround to leave the recession behind and pick up growth again. The company reported a strong 34% SW revenue growth in Q4 2010 as compared with the previous year – "the strongest software sales quarter in SAP's history," as stated by Co-CEO Bill McDermott. However, one has to keep in mind that one year ago SAP was in deep crisis and reported a 15% YoY SW revenue decline in Q4 2009, followed by the departure of CEO Léo Apotheker in February 2010 and other subsequent executive changes.

Indeed, Q4 2010 was the strongest SW sales quarter in SAP's history, but the fourth quarter is always the strongest in SAP's annual sales cycle. In fact, Q4 SW revenue had declined for two consecutive years, from €1,4 billion in 2007 to €1,3 billion in 2008 and €1,1 billion in 2009, so it was about time to turn the curve around again. And while Q4 2010 was the best SW revenue quarter, the full year 2010 was still not the best in SAP's history. SAP reported total SW revenues of €3,4 billion in 2007, €3,6 billion in 2008, €2,6 billion in 2009, and now €3,3 billion for 2010 – SW revenues are still below the levels of 2007 and 2008! While total revenue (€12,5 billion) looks to be back on track, net new SW license revenue still remains a challenging point in SAP's results!
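The base effect described above is easy to verify with the figures from the post (in billions of euros): +34% growth off the depressed 2009 quarter is just enough to edge past the pre-crisis 2007 quarter, while the full year still trails 2007.

```python
# Sanity-check the base effect using the Q4 and full-year software-revenue
# figures quoted in the post (EUR billions).
q4 = {2007: 1.4, 2008: 1.3, 2009: 1.1}
q4_2010 = round(q4[2009] * 1.34, 2)          # +34% YoY off the weak 2009 base

# Even off a weak base, Q4 2010 edges past every prior Q4 in the series.
best_quarter_ever = q4_2010 > max(q4.values())

full_year = {2007: 3.4, 2008: 3.6, 2009: 2.6, 2010: 3.3}
below_peak = full_year[2010] < full_year[2007]

print(f"Q4 2010: EUR {q4_2010}B, record quarter: {best_quarter_ever}, "
      f"full year still below 2007: {below_peak}")
```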

The new Q4 2010 revenue announcement is a very positive and promising signal, but the company needs to continue to innovate its portfolio to accelerate again new SW revenues for long-term sustained growth.

Please leave a comment or contact me directly.

Read more


Have You Changed Your Budget/Planning Cycle? We Want To Know

Chip Gliedman

Many organizations have seen large swings over the past two years in IT spending on technology, business spending on technology, and the way that IT and the business interact to best manage business technology. Have you seen changes in your budgeting and planning cycles? Does the business expect more (or less) from IT today than it did two years ago? How well aligned is your IT organization to business goals? We've seen these changes in many of the organizations we've been speaking with. But what about yours? Please let us know what's going on by taking this short survey on budgeting, planning, and alignment. If you're a member of our CIO panel, you received an invitation to participate in this survey, and we're hoping you'll respond. If you're not currently a member of the panel, you can join by clicking here. Thanks. We'll publish the results in March or April.


Forrester EA Forum Keynotes Map EA’s Shift From IT To Business

Alex Cullen

When I started as an architect, I was part of the team called “IT Architecture.” It was clear what we did and who we did it for – we standardized technology and designs so that IT would be more reliable, deliver business solutions more quickly, and cost less. We were an IT-centric function. Then the term “Enterprise Architecture” came in – and spurred debates as to “isn’t EA about the business?,” “what’s the right scope for EA?,” and “should EA report to the CEO?” We debated it, published books and blogs about it – but it didn’t change what most architects did; they did some flavor of IT Architecture.

Meanwhile, the interplay of business and technology changed. Technology became embedded and central to business results, and business leaders became technology advocates. The locus of technology innovation moved from the “heavy lifting” of core system implementations to the edges of the business, where business staff see opportunities and demand more autonomy to seize them. For enterprise architects, this means that regardless of what EA has been, in the future it must become a business-focused and embedded discipline. Mapping this shift is a key theme of Forrester’s Enterprise Architecture Forum 2011.

Gene Leganza, who will be presenting the opening keynote “EA In The Year 2020: Strategic Nexus Or Oblivion?,” states it this way:

Read more

Application Portfolio Management - Are You Doing It? Using Tools? Doing Without Tools?

Phil Murphy

Are you using any form of APM tool today?

  • If so, did you buy one?
  • What features did you find compelling? Lacking?
  • Was price a barrier to entry?

NetApp Acquires Akorri – Moving Up The Virtualization Stack

Richard Fichera

NetApp recently announced that it was acquiring Akorri, a small but highly regarded provider of management solutions for virtualized storage environments. All in all, this is yet another sign of the increasingly strategic importance of virtualized infrastructure and the need for existing players, regardless of how strong their positions are in their respective silos, to acquire additional tools and capabilities for management of an extended virtualized environment.

NetApp, while one of the strongest suppliers in the storage industry, faces continued pressure not only from EMC, which owns VMware and has been on a management-software acquisition binge for years, but also renewed pressure from IBM and HP, which are increasingly tying their captive storage offerings into their own integrated virtualized infrastructure offerings. This tighter coupling of proprietary technology, while not explicitly disenfranchising external storage vendors, will still tighten the screws slightly and reduce the number of opportunities for NetApp to partner with them. Even Dell, long regarded as a laggard in high-end enterprise presence, has been ramping up its investment in management capabilities and its ability to deliver integrated infrastructure – including the purchase of storage technology, its run at 3Par, and recent investments in companies such as Scalent (see my previous blog on Dell as an enterprise player and my colleague Andrew Reichman’s discussion of the 3Par acquisition) – a very clear signal that it wants to go even further as a supplier of integrated infrastructure.

Read more