End User Computing Predictions for 2012: Civil Disobedience Escalates - Part 1

David Johnson

Sedition is simmering in the halls of corporations the world over, as the thirst for productivity and new tools grows faster than IT organizations can quench it. 2012 promises to be the most transformative year for end user computing since the release of the IBM PC in 1981. The escalation of four trends - each individually interesting, but together explosive - will bring phase changes in the way Highly Empowered and Resourceful Operatives work, and offer previously captive employees new options for productive freedom by this time next year.

  1. As in IT revolutions past, on the front lines are restless high performers (executives, technology pros, and creatives), whose nature drives them to push the limits of themselves, their tools, and their support networks - and to bring their own technology to the office when their employers won't provide it. More employees will bring their own computers to the office in 2012 than ever before - most of them Macs - and if IT won't support them, they'll find another way that doesn't include IT.
  2. Cloud-based applications and services such as Dropbox and Projectplace are convincing these folks that they can get better results faster, without IT involved. And these services are priced at less than a few skinny soy chai lattes (no whip!) a week, so many employees just pay the tab themselves.
Read more

Oracle Open World Part 3 - Oracle’s “Engineered Systems”: Astute Integration Or Inspired Folly?

Richard Fichera

OK, out of respect for your time, now that I’ve caught you with a title that promises some drama, I’ll cut to the chase and tell you that I definitely lean toward the former. Having spent a couple of days here at Oracle Open World poking around the various flavors of Engineered Systems - the established Exadata and Exalogic, the new SPARC SuperCluster (all of a week old), and the newly announced Exalytics system for big data analytics - I am pretty convinced that they represent an intelligent and modular set of optimized platforms for specific workloads. Beyond being modular, they give me the strong impression of a “composable” architecture: the various elements of processing nodes, Oracle storage nodes, ZFS file nodes, and other components can clearly be recombined over time as customer requirements dictate, either as standard products or as custom configurations.

Read more

Oracle Open World Part 2 – Flash Mobs And The Quest For Performance

Richard Fichera

Well, actually I meant mobs of flash, but I couldn’t resist the word play. Although, come to think of it, flash mobs might be the right way to describe the density of flash memory system vendors here at Oracle Open World. Walking around the exhibits, it seems as if every other booth is occupied by someone selling flash memory systems to accelerate Oracle’s database, and all of them claim to be: 1) faster than anything Oracle, which already integrates flash into its systems, offers; and 2) faster and/or cheaper than the other flash vendor two booths down the aisle.

All joking aside, the proliferation of flash memory suppliers is pretty amazing, although a venue devoted to the world’s most popular database is exactly where you might expect to find them. In one sense flash is nothing new - RAM disks, arrays of RAM configured to mimic a disk, have been around since the 1970s - but they were small and really expensive, and never got on a cost and volume curve that could drive them into a mass-market product. Flash, benefiting not only from the inherent economies of semiconductor technology but also from the volume drivers of consumer products, has made the transition to a cost that makes it a reasonable alternative for some use cases, with database acceleration probably the most compelling. This explains why the flash vendors have gathered here in San Francisco this week to tout their wares - this is the richest collection of potential customers they will ever see in one place.

Read more

Silk Browser, The BIG Leap For Amazon’s Fire, Shows Innovative Use Of App Internet

Richard Fichera

My colleague James Staten recently wrote about AutoDesk Cloud as an exemplar of the move toward App Internet, the concept of implementing applications that are distributed between local and cloud resources in a fashion that is transparent to the user except for the improved experience. His analysis is 100% correct, and AutoDesk Cloud represents a major leap in CAD functionality, intelligently offloading the inherently parallel and intensive rendering tasks and facilitating some aspects of collaboration.
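
To make the App Internet concept concrete, here is a toy sketch - not AutoDesk's implementation, and with a hypothetical endpoint URL and payload format - of an application that transparently offloads work to a cloud service when one is reachable, and degrades gracefully to local computation when it isn't. The caller simply asks for a result and never needs to know where the work ran:

    # A toy illustration (not AutoDesk's implementation) of the App Internet
    # idea: the caller asks for a result, and the application decides
    # transparently whether to compute locally or offload to the cloud.
    # The endpoint URL and payload format below are hypothetical.
    import json
    import urllib.request

    CLOUD_RENDER_URL = "https://render.example.com/jobs"  # hypothetical service

    def render_scene(scene: dict) -> bytes:
        """Return rendered output, offloading to the cloud when reachable."""
        try:
            req = urllib.request.Request(
                CLOUD_RENDER_URL,
                data=json.dumps(scene).encode(),
                headers={"Content-Type": "application/json"},
            )
            with urllib.request.urlopen(req, timeout=5) as resp:
                return resp.read()        # cloud farm did the heavy lifting
        except OSError:
            return render_locally(scene)  # degrade gracefully on the desktop

    def render_locally(scene: dict) -> bytes:
        # Stand-in for a local renderer; the real work would happen here.
        return f"locally rendered {len(scene.get('objects', []))} objects".encode()

The point of the pattern is the fallback branch: the split between local and cloud resources is an internal detail, invisible to the user except for the improved experience.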

But (and there’s always a “but”), having been involved in graphics technology on and off since the '80s, I would say that “cloud” implementation of rendering and analysis has been incrementally evolving for decades, with hundreds of well-documented distributed environments in which desktops fluidly shipped rendering and analysis jobs to local farms that would today be called private clouds, with the results shipped back to the originating workstations. This work was largely developed and paid for by universities and by media companies as part of major movie production projects. Some of these installations were of significant scale, such as “Massive,” the rendering and animation farm for "Lord of the Rings" that had approximately 1,500 compute nodes, and a subsequent installation at Weta that may have up to 7,000 nodes. In my admittedly arguable opinion, AutoDesk Cloud, while representing a major jump in capabilities by making the cloud accessible to a huge number of users, is not a major architectural innovation but rather an incremental step.

Read more

Brand Loyalty Is Declining. Total Product Experience Chains Can Help.

JP Gownder

Product strategists in many industries (from CPG to consumer electronics to financial services) share a challenge with their marketing colleagues: how to leverage the power of brand. Product strategists have a number of tools in their toolboxes for differentiating their products from competitors’ offerings: features (a different taste, a new technical capability, or a higher interest rate, for instance), channel, price, or brand - or some combination of these factors. For the moment, let’s think about brand, because some product strategists design and build their products based largely on the promise implied by their brand name.

Forrester’s new research report – leveraging a multi-year analysis of Consumer Technographics® data – shows that while brand is important, brand loyalty (defined as the propensity to repurchase a brand) has been waning. The report, entitled “Brand Loyalty Isn’t Enough For Products Anymore,” reveals that:

- Brand loyalty is on the decline. Brand loyalty dropped in the U.S. from 2006 to 2010, our data shows. One reason? The Great Recession. Another? The strength of brands themselves: competing brands in the marketplace entice consumers to try new brands.

Read more

Oracle Open World Part 1 – The Circus Comes To Town And The Acts Are Great!

Richard Fichera

In the good old days, computer industry trade shows were bigger-than-life events - booths with barkers and actors, ice cream and espresso bars, in-booth games, magic acts, and surging crowds gawking at technology. In recent years they have, for the most part, become sad shadows of their former selves. The great SHOWS are gone, replaced with button-down vertical and regional events where you are lucky to get a pen or a miniature candy bar for your troubles.

Enter Oracle OpenWorld. Mix 45,000 people, hundreds of exhibitors, and one of the world’s largest software and systems companies looking to make an impression, and you have the new generation of technology extravaganza. The scale is extravagant: the show takes up the entire Moscone Center complex (N, S and W) along with a couple of hotel venues, closes off a block of a major San Francisco street for a week, and throws a little evening party for 20 or 30 thousand people.

But mixed with the hoopla - which included wheel-of-fortune giveaways that had hundreds of people snaking around the already crowded exhibition floor in serpentine lines, mini golf and whack-a-mole games in the exhibit booths, and the aforementioned espresso and ice cream stands - there was genuine content, along with the public face of some significant trends. So far, after 24 hours, some major messages come through loud and clear:

Read more

ITSM: People are the Problem, but People are the Solution

Glenn O'Donnell

I just spent most of this week at the annual itSMF conference, Fusion, held this year at the sprawling Gaylord National Resort just south of Washington, DC. As always, it was a wonderful gathering of some of the finest people I know. When you’ve been involved in the IT service management field as long as I have, you get to know a LOT of these people very well. In fact, when I delivered the closing keynote of Fusion in 2009, I opened by saying, “This feels like a family reunion … except I like you more!” I was only half joking, because many of these people ARE like family and I do indeed like them.

As Forrester’s “automation guy,” I often make statements about the flaws of the people in IT. I always try to inject some comedy into these statements because we have to be able to laugh at ourselves. There is a serious side to this position, however. There are now just under 7 billion idiots on this planet, and none of us is exempt from that characterization. People do dumb things. We all do. Hopefully, we do more smart things than dumb things. Since we do dumb things, we need to protect ourselves from ourselves.

ITSM is one of many mechanisms that offer such protection. We need ITSM because IT has rightfully earned an awful reputation for chaotic execution. Indeed, IT seems to be one of the most egregious examples of human error and sloppiness: it is full of smart people doing dumb things. We in IT have a very serious problem.

Read more

DCIM And The New Reality Of Infrastructure & Operations

Richard Fichera

I recently published an update on power and cooling in the data center (http://www.forrester.com/go?docid=60817), and as I review it online, I am struck by the combination of old and new. The old: the evolution of semiconductor technology, the increasingly elegant attempts to design systems and components that can be incrementally throttled, and the ever more sophisticated construction of the data centers themselves, with growing modularity and physical efficiency of power and cooling.

The new is the incredible momentum I see behind Data Center Infrastructure Management (DCIM) software. In a few short years, DCIM solutions have gone from simple aggregated viewing dashboards to complex software that understands tens of thousands of components and collects, filters, and analyzes data from thousands of sensors in a data center (a single CRAC may have in excess of 20 sensors, a server over a dozen, and so on). These solutions understand the relationships between components well enough to proactively raise alarms, model potential workload placement, and make recommendations about prospective changes.
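
As a rough illustration of what that sensor-level analysis involves, here is a minimal sketch in Python - not any vendor's actual API; the component names, metrics, and thresholds are purely illustrative assumptions - of the aggregate-filter-alarm loop at the heart of such tools:

    # A minimal sketch (no vendor's actual API) of DCIM-style sensor
    # aggregation and alarming. All names and thresholds are illustrative.
    from dataclasses import dataclass
    from statistics import mean
    from typing import Iterable

    @dataclass
    class Reading:
        component_id: str   # e.g., "crac-07" or "server-rack12-u3"
        metric: str         # e.g., "inlet_temp_c" or "power_w"
        value: float

    # Hypothetical limits; real DCIM products derive these from component
    # models and manufacturer specifications, not a hard-coded table.
    THRESHOLDS = {"inlet_temp_c": 27.0, "power_w": 450.0}

    def raise_alarms(readings: Iterable[Reading]) -> list[str]:
        """Average each component's readings per metric and flag breaches."""
        buckets: dict[tuple[str, str], list[float]] = {}
        for r in readings:
            buckets.setdefault((r.component_id, r.metric), []).append(r.value)
        alarms = []
        for (component, metric), values in buckets.items():
            avg = mean(values)
            limit = THRESHOLDS.get(metric)
            if limit is not None and avg > limit:
                alarms.append(f"{component}: {metric} avg {avg:.1f} exceeds {limit}")
        return alarms

    if __name__ == "__main__":
        sample = [
            Reading("crac-07", "inlet_temp_c", 26.5),
            Reading("crac-07", "inlet_temp_c", 28.9),
            Reading("server-rack12-u3", "power_w", 310.0),
        ]
        for alarm in raise_alarms(sample):
            print(alarm)

The real products layer relationship models, workload placement simulation, and change recommendations on top of this basic loop, but filtering thousands of raw readings down to a handful of actionable alarms is the foundation.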

Of all the technologies reviewed in the document, DCIM offers some of the greatest potential for improving overall efficiency without sacrificing the reliability or scalability of the enterprise data center. While the various DCIM suppliers are still experimenting with business models, I think it is almost essential for any data center operations group that expects significant change - be it growth, shrinkage, migration, or a major consolidation or cloud project - to invest in DCIM software. DCIM consumers can expect major competitive action among the current suppliers, and there is strong potential for additional consolidation.

Is IT Infrastructure & Operations Still Relevant In “The Age Of The Customer”?

Doug Washburn

Yes, but you must adapt by demonstrating your ability to drive business growth and differentiation, not just cost savings and uptime. Here’s a personal example of a much broader trend as to why this is so important to your business and your role as an I&O professional:

It’s a cool autumn day, which reminds me I need a new jacket. I walk into Patagonia. I evaluate several models and then buy one - but not from Patagonia. It turns out a competitor located two miles away is offering the jacket at a discount. How did I know this? I scanned the product's bar code using the RedLaser app on my iPhone, which displayed several local retailers with lower prices. If I had been willing to wait three days for shipping, I could have purchased that same jacket, while standing in Patagonia, from an online retailer with an even better deal. [Truth be told: I actually bought the jacket from Patagonia's store after validating that no better deal existed… but The Home Depot wasn’t so lucky this summer, when I bought the same air conditioner, cheaper, from Amazon while standing in aisle 4.]

This is a prime example of what Forrester calls “The Age Of The Customer,” in which empowered buyers have the information at their fingertips to check a price, read a product review, or ask for advice from a friend, right from the screen of their smartphone. This type of technology-led disruption is eroding traditional competitive barriers across all industries; manufacturing strength, distribution power, and information mastery can't save you.

Read more

Pulling Off A Razor-Razorblade Product Strategy, Like Amazon's Product Strategists

JP Gownder

Amazon’s product strategists shocked some constituencies with the $199 price point for the Amazon Kindle Fire tablet announced today. But there’s a fundamental product strategy lesson in this pricing, and it’s an old one: the so-called razor-razorblade pricing model.

We all know this model well as consumers: your initial purchase of a razor is relatively cheap, but the cost of replacement razorblades really adds up over time. If you don’t buy razors, perhaps you’re familiar with this scenario from your inkjet printer. Remember how cheap that scanner/printer was - but have you ever seen the price of refill inkjet cartridges?

The razor-razorblade model works when “dependent goods” - the refills, the stuff you need to keep buying to use the product - are closely related to the anchor product. In the case of the Kindle Fire, the dependent goods are content and services: MP3s, streaming videos, and of course books, magazines, and newspapers, plus cloud services that allow you to store and synchronize your content across devices. Amazon’s product strategists can afford to charge a low entry price to drive adoption of the device, and then (they hope) deliver an experience attractive enough that Kindle Fire owners will pay for it as a service.
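
The underlying economics are simple enough to sketch in a few lines. Every number below is an illustrative assumption - neither Amazon's actual hardware costs nor its content margins are public - but they show why a negative hardware margin can still produce a profitable customer:

    # Back-of-envelope razor-razorblade arithmetic. Every figure here is an
    # illustrative assumption, not Amazon's actual cost or revenue data.
    device_price = 199.00    # Kindle Fire launch price
    device_cost = 210.00     # assumed build + distribution cost
    hardware_margin = device_price - device_cost  # negative: sold at a loss

    content_margin_per_month = 4.00  # assumed profit on books, MP3s, video
    ownership_months = 24            # assumed device lifetime

    lifetime_value = hardware_margin + content_margin_per_month * ownership_months
    months_to_break_even = -hardware_margin / content_margin_per_month

    print(f"Hardware margin: ${hardware_margin:.2f}")
    print(f"Break-even after {months_to_break_even:.1f} months of content purchases")
    print(f"Lifetime value per device: ${lifetime_value:.2f}")

Under these assumed figures, the device pays back its subsidy within a few months of typical content purchases, and everything after that is margin - which is exactly why the entry price can be set so low.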

Hence Amazon CEO Jeff Bezos’ portrayal of the Kindle Fire product strategy: “What we are doing is offering premium products at non-premium prices,” Bezos says. Other tablet contenders “have not been competitive on price” and “have just sold a piece of hardware. We don’t think of the Kindle Fire as a tablet. We think of it as a service.”

Read more