Don’t Underestimate The Value Of Information, Documentation, And Expertise!

Andre Kindness

With all the articles written about IPv4 addresses running out, Forrester’s phone lines are lit up like a Christmas tree. Clients are asking what they should do, whom they should engage, and when they should start embracing IPv6. Like the old adage “It takes a village to raise a child,” Forrester is only one component; therefore, I started to compile a list of vendors and tactical documentation links that would help customers transition to IPv6. As I combed through multiple sites, the knowledge and documentation chasm between vendors became apparent. If a vendor doesn’t understand your business goals or have the knowledge to solve your business issues, is it a good partner? Are acquisition and warranty costs the only or largest considerations when making a change to a new vendor? I would say no.

Support documentation and access to knowledge are especially critical in networking design, deployment, maintenance, and upgrades. Some pundits have relegated networking to a commodity play, but networking is more than plumbing. It’s the fabric that supports a dynamic business, connecting users to services that are relevant to the moment, are aggregated at the point of use, and originate from multiple locations. The complexity has evolved from designing in a few links to managing tens or hundreds of relationships (security, acceleration, prioritization, etc.) along the flow of apps and data through a network. Virtualization, convergence, consolidation, and the evolving data center networks are prime examples of today’s network complexity. In response to this complexity, architects and practitioners turn to books, training materials, blogs, and repositories so that they can:

  • Set up an infrastructure more quickly or with a minimal number of issues, since there is a design guide or blueprint.
Read more

AMD Bumps Its Specs, Waits For Interlagos And Bulldozer

Richard Fichera

Since the introduction of its Core 2 architecture, Intel has reversed much of the damage done to it by AMD in the server space, with attendant publicity. AMD, however, has been quietly reclaiming some ground with its 12-core 6100 series CPUs, showing strength in benchmarks that emphasize high throughput in process-rich environments as opposed to maximum performance per core. Several AMD-based system products have also been cited to us by their manufacturers as enjoying very strong customer acceptance due to the throughput of the 12-core CPUs combined with their attractive pricing. As a fillip to this success, AMD this past week announced speed bumps for the 6100-series products to give a slight performance boost as they continue to compete with Intel’s Xeon 5600 and 7500 products (Intel’s Sandy Bridge server products have not yet been announced).

But the real news last week was the quiet subtext: the anticipated 16-core Interlagos products based on the new Bulldozer core appear to be on schedule for Q2 ’11 shipments to system partners, who should be able to ship systems during Q3, and AMD is still certifying them as compatible with the current sockets used for the 12-core 6000 CPUs. This implies that system partners will be able to deliver products based on the new parts very rapidly.

Actual performance of these systems will obviously depend on the workloads being run, but our gut feeling is that while they will not rival the per-core performance of the Intel Xeon 7500 CPUs, they may shine in large throughput-oriented environments with high numbers of processes, a description that fits many web and middleware environments. With up to a 50% performance advantage per core over the current AMD CPUs, these parts may deliver some impressive benchmarks and keep the competition in the server space at a boil, which in the end is always helpful to customers.
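Some back-of-the-envelope arithmetic (mine, not the article’s; it simply combines the core counts and the “up to 50%” per-core figure cited above) shows why these parts could matter for throughput workloads:

```python
# Illustrative arithmetic only: rough relative throughput of a 16-core
# Interlagos part vs. the current 12-core 6100-series, assuming the
# text's "up to 50%" per-core gain and perfect scaling across cores.
current_cores = 12
new_cores = 16
per_core_gain = 1.5   # up to 50% faster per core, per the text

relative_throughput = (new_cores * per_core_gain) / current_cores
print(f"relative throughput: {relative_throughput:.1f}x")  # 2.0x
```

Under those optimistic assumptions, a socket-compatible upgrade roughly doubles aggregate throughput, which is the kind of jump that shows up in the benchmarks mentioned above.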

The Passing Of A Giant – Digital Equipment Founder Ken Olsen Dead At 84

Richard Fichera

One evening in 1972 I was hanging out in the computer science department at UC Berkeley with a couple of equally socially backward friends waiting for our batch programs to run, and to kill some time we dropped in on a nearby physics lab that was analyzing photographs of particle tracks from one of the various accelerators that littered the Lawrence Radiation Laboratory. Analyzing these tracks was real scut work – the overworked grad student had to measure angles between tracks, length of tracks, and apply a number of calculations to them to determine if they were of interest. To our surprise, this lab had something we had never seen before – a computer-assisted screening device that scanned the photos and in a matter of seconds determined whether they contained any formations that were of interest. It had a big light table, a fancy scanner, whirring arms and levers and gears, and off in the corner, the computer, “a PDP from Digital Equipment.” It was a 19” rack mount box with an impressive array of lights and switches on the front. As a programmer of the immense 1 MFLOP CDC 6400 in the Rad Lab computer center, I was properly dismissive…

This was a snapshot of the dawn of the personal computer era, almost a decade before IBM introduced the PC and blew it wide open. The PDP (Programmed Data Processor) systems from MIT-trained engineer Ken Olsen were the beginning of a fundamental change in the relationship between man and computer, putting a person in the computing loop instead of keeping them standing outside the temple.

Read more

The ITSM Selection Process

Eveline Oehrlich

Almost every day I get the question: “We want to replace our ITSM support tool; which vendors should I look at?” There are many alternatives today, and each vendor has certainly done a great amount of work to position itself as the best. The success I have had consulting with these clients, and the knowledge I carry with me now, is thanks in part to the clients with whom I have discussed the ITSM space. They have all confirmed that the functionality across these vendors is very similar. This, however, does not help in decision-making — so I’m especially excited to have authored a three-part research document that might take some of the magic out of the decision process when selecting ITSM support tools in the future.

This Forrester report is called Eliminate Magic When Selecting The Right IT Service Management (ITSM) Support Tool. It’s an overview of the process decision-makers need to follow and the other important — but sometimes overlooked — criteria to keep in mind as they work toward launching or engaging with the ITSM vendor community.

I identified four phases of the evaluation process that should be followed:

Plan: Lay the groundwork, set objectives, explore existing conversations, and make necessary early decisions.

Assemble an evaluation team: Put the right people together to understand the use cases and requirements; this is critical before moving to the next step.

Define your requirements: Use the ITSM Support Tools Product Comparison to define your requirements.

Read more


IBM And ARM Continue Their Collaboration – Major Win For ARM

Richard Fichera

Last week IBM and ARM Holdings Plc quietly announced a continuation of their collaboration on advanced process technology, this time with a stated goal of developing ARM IP optimized for IBM physical processes down to a future 14 nm size. The two companies have been collaborating on semiconductors and SOC design since 2007, and this extension has several important ramifications for both companies and their competitors.

It is a clear indication that IBM retains a major interest in low-power and mobile computing, despite its previous divestment of its desktop and laptop computer business to Lenovo, and that it will be in a position to harvest this technology, particularly ARM's modular approach to composing SOC systems, for future productization.

For ARM, the implications are clear. Its latest announced product, the Cortex A15, which will probably appear in system-level products in approximately 2013, will initially be produced in 32 nm with a roadmap to 20 nm. The existence of a roadmap to a potential 14 nm product serves notice that the new ARM architecture will have a process roadmap that keeps it on Intel’s heels for another decade. ARM has parallel alliances with TSMC and Samsung as well, and there is no reason to think that these will not be extended, but the IBM alliance is an additional insurance policy. Beyond semiconductor technology, IBM also has a deep well of systems and CPU IP that certainly cannot hurt ARM.

Read more

POST: Refining Your Strategy For iPads and Tablets -- The Workshop!

JP Gownder

Are you a product strategist trying to craft an iPad (or general tablet) product strategy? For example, are you thinking about creating an app to extend your product proposition using the iPad or another tablet computer?

At Forrester, we’ve noticed that product strategists in a wide variety of verticals – media, retail, travel, consumer products, financial services, pharmaceuticals, software, and many others – are struggling to make fundamental decisions about how the iPad (and newer tablets based on Android, Windows, webOS, RIM’s QNX, and other platforms) will affect their businesses.

To help these clients, an analyst on my team, Sarah Rotman Epps, has designed a one-day Workshop that she’ll be conducting twice, on February 8th and February 9th, in Cambridge, Massachusetts.

She’ll be helping clients answer fundamental questions, such as:

  • Do we need to develop an iPad app for our product/service/website? If we don't build an app, what else should we do?
  • What are the best practices for developing app products for the iPad? What are the features of these best-in-class app products?
  • Which tablet platforms should we prioritize for development, aside from the iPad?
  • Which tablets will be the strongest competitors to the iPad?
Read more

Vendors Must Modify Strategies To Reach New Segments Of Mobile Workers

Michele Pelino

Vendors in the mobility ecosystem are dramatically underestimating the demand for mobility solutions in the corporate arena. Why? Because they are missing demand that will come from two emerging segments of employees: Mobile Wannabes and Mobile Mavericks. When combined, these two worker segments account for 22% of all employees today, but by 2015 they will grow significantly to 42% of all corporate employees. To identify the needs of these mobile workers, Forrester analyzed results from the Forrsights Workforce Employee Survey, Q3 2010, which was fielded to over 5,500 employees in Canada, France, Germany, the UK, and the US and captures their smartphone device usage, purchasing behavior, and mobile application adoption.

Mobile Wannabe employees work in desk jobs at an office and do not get mobile devices from the corporate IT department, but they “want to” use their smartphone devices for work. Today, Mobile Wannabe workers account for 16% of all employees worldwide; however, by 2015, this segment will account for nearly 30% of all employees. Wannabe worker roles include executive assistants, clerical personnel, human resource workers, and customer service representatives. Momentum in this segment is driven by Millennial workers who grew up having easy access to personal computers and mobile phones and often purchase smartphones prior to entering the workforce.

Read more

Why Product Strategists Should Embrace Conjoint Analysis

JP Gownder

Aside from my work with product strategists, I’m also a quant geek. For much of my career, I’ve written surveys (to study both consumers and businesses) to delve deeply into demand-side behaviors, attitudes, and needs. For my first couple of years at Forrester, I actually spent 100% of my time helping clients with custom research projects that employed data and advanced analytics to help drive their business strategies.

These days, I use those quantitative research tools to help product strategists build winning product strategies. I have two favorite analytical approaches: my second favorite is segmentation analysis, which is an important tool for product strategists. But my very favorite tool for product strategists is conjoint analysis. If you, as a product strategist, don’t currently use conjoint, I’d like you to spend some time learning about it.

Why? Because conjoint analysis should be in every product strategist’s toolkit. Also known as feature tradeoff analysis or discrete choice, conjoint analysis can help you choose the right features for a product, determine which features will drive demand, and model pricing for the product in a very sophisticated way. It’s the gold standard for price elasticity analysis, and it offers extremely actionable advice on product design. It helps address each of “the four Ps” that inform product strategies.
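To make the idea concrete, here is a minimal, hypothetical sketch of ratings-based conjoint analysis in Python. The profiles, ratings, and feature names are invented for illustration (real studies use many more attributes, respondents, and usually discrete-choice rather than ratings data); the point is just how part-worth utilities fall out of the tradeoffs:

```python
# Tiny ratings-based conjoint sketch: respondents rate product
# profiles, and we recover a part-worth utility for each feature
# level with ordinary least squares.
import numpy as np

# Hypothetical profiles, dummy-coded as [large_screen, low_price]
# (baseline = small screen, high price).
profiles = np.array([
    [0, 0],  # small screen, high price
    [0, 1],  # small screen, low price
    [1, 0],  # large screen, high price
    [1, 1],  # large screen, low price
], dtype=float)

# Hypothetical average ratings for each profile (1-10 scale).
ratings = np.array([3.0, 6.0, 5.0, 8.0])

# Add an intercept column and solve for the part-worths.
X = np.column_stack([np.ones(len(profiles)), profiles])
coef, *_ = np.linalg.lstsq(X, ratings, rcond=None)
intercept, worth_large_screen, worth_low_price = coef

print(f"part-worth, large screen: {worth_large_screen:.1f}")  # 2.0
print(f"part-worth, low price:    {worth_low_price:.1f}")     # 3.0
```

In this toy data, a low price adds more utility than a large screen, which is exactly the kind of feature-versus-price tradeoff signal that makes conjoint so actionable for product design and pricing.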

Read more

ARM-Based Servers – Looming Tsunami Or Just A Ripple In The Industry Pond?

Richard Fichera

Once nothing more than outlandish speculation, the prospects for a new entrant into the volume Linux and Windows server space have suddenly become much more concrete, culminating in an immense buzz at CES as numerous players, including NVIDIA and Microsoft, stoked the fires with innuendo, announcements, and demos.

Consumers of x86 servers are always on the lookout for faster, cheaper, and more power-efficient servers. When they can’t get all three, the combination of cheaper and more energy-efficient seems attractive to a large enough chunk of the market to have motivated Intel, AMD, and all their system partners to develop low-power chips and servers designed for high-density compute and web/cloud environments. Up until now the debate was Intel versus AMD, and low power meant a CPU with four cores and a power dissipation of 35 – 65 watts.

The Promised Land

The performance trajectory of processors that were formerly purely mobile device processors, notably the ARM Cortex, has suddenly introduced a new potential option into the collective industry mindset. But is this even a reasonable proposition, and if so, what does it take for it to become a reality?

Our first item of business is to figure out whether it even makes sense to think about these CPUs as server processors. My quick take is yes, with some caveats. The latest ARM offering is the Cortex A9, with vendors currently offering dual-core products at up to 1.2 GHz (the architecture claims scalability to four cores and 2 GHz). It draws approximately 2 W, much less than any single-core x86 CPU, and a multi-core version should be able to execute any reasonable web workload. Coupled with the promise of embedded GPUs, the notion of a server that consumes much less power than even the lowest-power x86 begins to look attractive. But…
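The power arithmetic behind that attraction is easy to sketch (my numbers, using the ~2 W ARM and 35 W low-end x86 figures from the text; the rack budget is a hypothetical round number, and this ignores memory, I/O, and per-core performance):

```python
# Back-of-envelope arithmetic only: how many CPUs fit a fixed CPU
# power budget at ~2 W per ARM Cortex A9 vs. ~35 W for a low-power
# x86 part. Real servers add memory, I/O, and fan power on top.
RACK_CPU_POWER_BUDGET_W = 2000   # hypothetical per-rack CPU budget
ARM_CPU_W = 2                    # dual-core Cortex A9, per the text
X86_LOW_POWER_CPU_W = 35         # low end of the 35 - 65 W x86 range

arm_cpus = RACK_CPU_POWER_BUDGET_W // ARM_CPU_W
x86_cpus = RACK_CPU_POWER_BUDGET_W // X86_LOW_POWER_CPU_W

print(f"ARM CPUs in budget: {arm_cpus}")   # 1000
print(f"x86 CPUs in budget: {x86_cpus}")   # 57
```

Even granting the x86 parts a large per-core performance edge, an order-of-magnitude difference in sockets per watt is what makes the proposition worth taking seriously for throughput-bound web workloads.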

Read more

NetApp Acquires Akorri – Moving Up The Virtualization Stack

Richard Fichera

NetApp recently announced that it was acquiring Akorri, a small but highly regarded provider of management solutions for virtualized storage environments. All in all, this is yet another sign of the increasingly strategic importance of virtualized infrastructure and the need for existing players, regardless of how strong their positions are in their respective silos, to acquire additional tools and capabilities for management of an extended virtualized environment.

NetApp, while one of the strongest suppliers in the storage industry, faces continued pressure not only from EMC, which owns VMware and has been on a management software acquisition binge for years, but also renewed pressure from IBM and HP, which are increasingly tying their captive storage offerings into their own integrated virtualized infrastructure offerings. This tighter coupling of proprietary technology, while not explicitly disenfranchising external storage vendors, will still tighten the screws slightly and reduce the number of opportunities for NetApp to partner with them. Even Dell, long regarded as the laggard in high-end enterprise presence, has been ramping up its investment in management and its ability to deliver integrated infrastructure, including the purchase of storage technology and, with its run at 3Par and recent investments in companies such as Scalent, a very clear signal that it wants to go even further as a supplier of integrated infrastructure (see my previous blog on Dell as an enterprise player and my colleague Andrew Reichman’s discussion of the 3Par acquisition).

Read more