The Gamification Of Business

Nigel Fenwick
The most engaging, most entertaining, and most stimulating presentation of IBM Connect 2013 came on the third day, at the end of the opening session. I'm ashamed to admit that I didn't know Jane McGonigal when she came on stage. But after a minute I was fully engaged and tweeting insights and pearls of wisdom from her presentation.
 
I had missed the title of her presentation, but Jane was already throwing out fascinating data points on game playing. Now you have to understand, game playing to me is that thing my son does to avoid doing his homework. I haven't thought deeply about games since I built two animated game simulations on an Apple II to teach people business in my final year of university back in '84 (now I'm dating myself).
 
"We've invested 400,000 years playing Angry Birds" - Jane is on a roll now. I'm thinking "oh my, I too had contributed a few of those hours." Before giving it up as a colossal waste of time of course. I didn't know it, but apparently I was suffering from what Clive Thompson calls "gamers regret".
 
Read more

Oracle Delivers On SPARC Promises

Richard Fichera

Background

When I returned to Forrester in mid-2010, one of the first blog posts I wrote was about Oracle’s new roadmap for SPARC and Solaris, catalyzed by numerous client inquiries and other interactions in which Oracle’s real level of commitment to future SPARC hardware was the topic of discussion. In most cases I could describe the customer mood as skeptical at best, and panicked and committed to migration off of SPARC and Solaris at worst. Nonetheless, after some time spent with Oracle management, I expressed my improved confidence in the new hardware team that Oracle had assembled and their new roadmap for SPARC processors after the successive debacles of the UltraSPARC-5 and Rock processors under Sun’s stewardship.

Two and a half years later, it is obvious that Oracle has delivered on its commitments regarding SPARC and is continuing its investments in SPARC CPU and system design as well as its Solaris OS technology. The latest evolution of SPARC technology, the SPARC T5 and the soon-to-be-announced M5, continue the evolution and design practices set forth by Oracle’s Rick Hetherington in 2010 — incremental evolution of a common set of SPARC cores, differentiation by variation of core count, threads and cache as opposed to fundamental architecture, and a reliable multi-year performance progression of cores and system scalability.

Geek Stuff – New SPARC Hardware

Read more

Design Thinking Blurs The Line Between Process And Experience Design

Clay Richardson

Lately, I have become a bit obsessed with evaluating the linkage between good process design and good experience design. This obsession was initially sparked by primary research I led earlier this year on reinventing and redesigning business processes for mobile. The mobile imperative is driving a laser focus on creating exceptional user experiences for customers, employees, and partners. But this focus on exceptional design is not only reshaping the application development world; it is also radically changing the way companies approach business process design.

Over the past six months, I have run across more and more BPM teams in which user experience is playing a much larger role in driving business process change. Some of these teams noted that they see experience design playing a greater role in driving process change than the actual process modeling and analysis aspects of process improvement.

Read more

HP’s Troubles Continue, But Does It Matter?

Richard Fichera

HP seems to be on a tear, bouncing from litigation with one of its historically strongest partners to multiple CEOs in the last few years, continued layoffs, and a recent massive write-down of its EDS purchase. And, as we learned last week, the circus has not left town. The latest “oops” is an $8.8 billion write-down for its purchase of Autonomy, under the brief and ill-fated leadership of Léo Apotheker, combined with allegations of serious fraud on the part of Autonomy during the acquisition process.

The eventual outcome of this latest fiasco will be fun to watch, with many interesting sideshows along the way, including:

  • Whose fault is it? Can they blame it on Léo, or will it spill over onto Meg Whitman, who was on the board and approved it?
  • Was there really fraud involved?
  • If so, how did HP miss it? What about all the internal and external people involved in due diligence of this acquisition? I’ve been on the inside of attempted acquisitions at HP, and there were always many more people around with the power to say “no” than there were people who were trying to move the company forward with innovative acquisitions, and the most persistent and compulsive of the group were the various finance groups involved. It’s really hard to see how they could have missed a little $5 billion discrepancy in revenues, but that’s just my opinion — I was usually the one trying to get around the finance guys. :)
Read more

IBM Raises The CPU Technology Bar With POWER7+

Richard Fichera

Nathan Bedford Forrest, a Confederate general of despicable ideology and consummate tactics, spoke of “keepin up the skeer,” applying continued pressure to opponents to prevent them from regrouping and counterattacking. POWER7+, the most recent version of IBM’s POWER architecture, anticipated as a follow-up to the POWER7 for almost a year, was finally announced this week, and appears to be “keepin up the skeer” in terms of its competitive potential for IBM POWER-based systems. In short, it is a hot piece of technology that will keep existing IBM users happy and should help IBM maintain its impressive momentum in the Unix systems segment.

For the chip heads, the CPU is implemented in a 32 nm process, the same as Intel's upcoming Poulson, and embodies some interesting evolutions in high-end chip design, including:

  • Use of DRAM instead of SRAM — IBM has pioneered the use of embedded DRAM (eDRAM) as L3 cache in place of the more standard, and faster, SRAM. In exchange for the loss of speed, eDRAM needs far fewer transistors and draws less power, allowing IBM to pack in a total of 80 MB (a lot) of shared L3 cache, far more than any other product has ever sported; the rough comparison below illustrates the transistor savings.
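To put those transistor savings in perspective, here is a minimal back-of-envelope sketch in Python. It assumes the textbook cell structures (a classic 6-transistor SRAM cell versus a 1-transistor-plus-capacitor eDRAM cell) and ignores tag, ECC, and peripheral circuitry, so the figures are purely illustrative rather than a statement about IBM's actual design:

# Rough, illustrative comparison of cache cell transistor budgets.
# Assumes a classic 6T SRAM cell and a 1T1C eDRAM cell; ignores tags, ECC,
# and peripheral logic, so these are order-of-magnitude numbers only.
cache_bits = 80 * 1024 * 1024 * 8     # 80 MB of shared L3 cache, expressed in bits
sram_transistors = cache_bits * 6     # ~4.0 billion transistors for 6T SRAM cells
edram_transistors = cache_bits * 1    # ~0.67 billion transistors for 1T1C eDRAM cells
print(f"SRAM  cell transistors: {sram_transistors / 1e9:.1f} billion")
print(f"eDRAM cell transistors: {edram_transistors / 1e9:.2f} billion")

On this crude accounting, an eDRAM array spends roughly one-sixth of the transistor budget of an SRAM array of the same capacity, which is what makes an unusually large on-die shared L3 affordable.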
Read more

AMD Acquires SeaMicro — Big Bet On Architectural Shift For Servers

Richard Fichera

[For some reason this has been unpublished since April — so here it is well after AMD announced its next spin of the SeaMicro product.]

At its recent financial analyst day, AMD indicated that it intended to differentiate itself by creating products with an advantage in niche markets, with specific mention, among other segments, of servers, and to generally shake up the trench warfare that has had it on the losing side of its lifelong battle with Intel (my interpretation, not AMD management’s words). Today, at least for the server side of the business, it made a move that could give it visibility and differentiation by acquiring innovative server startup SeaMicro.

SeaMicro has attracted our attention since its appearance (blog post 1, blog post 2) with its innovative architecture, which dramatically reduces power and improves density by sharing components like I/O adapters, disks, and even BIOS over a proprietary fabric. The irony here is that SeaMicro came to market in tight alignment with Intel, which at one point even introduced a special dual-core packaging of its Atom CPU to allow SeaMicro to improve its density and power efficiency. Most recently, SeaMicro and Intel announced a new model featuring Xeon CPUs to address the more mainstream segments that were not part of SeaMicro’s original Atom-based offering.

Read more

Data Center Power And Efficiency – Public Enemy #1 Or The Latest Media Punching Bag?

Richard Fichera

This week, the New York Times ran a series of articles about data center power use (and abuse): “Power, Pollution and the Internet” (http://nyti.ms/Ojd9BV) and “Data Barns in a Farm Town, Gobbling Power and Flexing Muscle” (http://nyti.ms/RQDb0a). Among the claims made in the articles was that data centers are “only using 6 to 12% of the energy powering their servers to deliver useful computation.” As with a lot of media broadsides, the reality is more complex than the dramatic claims made in these articles. Technically they are correct in claiming that, of the electricity going to a server, only a very small fraction is used to perform useful work, but this dramatic claim is not a fair representation of the overall efficiency picture. The Times analysis fails to take into consideration that not all of the power in the data center goes to servers, so the claim of 6% efficiency for the servers is not representative of the real operational efficiency of the complete data center.

On the other hand, while I think the Times chooses drama over even-keeled reporting, the actual picture for even a well-run data center is not as good as its proponents would claim. Consider:

  • A new data center with a PUE of 1.2 (very efficient), with 83% of the power going to IT workloads.
  • Then assume that 60% of the remaining power goes to servers (storage and network get the rest), for a net of almost 50% of the power going into servers. If the servers are running at an average utilization of 10%, then only 10% of 50%, or 5%, of the power is actually going to real IT processing. Of course, the real "IT number" is servers plus storage plus network, so depending on how you account for them, the IT usage could be as high as 38% (.83*.4 + .05); the back-of-envelope calculation below walks through the math.
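For readers who want to check the arithmetic, here is a minimal sketch in Python using the illustrative assumptions above (a PUE of 1.2, 60% of IT power going to servers, and 10% average server utilization):

# Back-of-envelope data center efficiency math from the figures cited above.
pue = 1.2                     # total facility power / IT equipment power
it_share = 1 / pue            # ~83% of facility power reaches IT gear
server_share = 0.60           # share of IT power going to servers
utilization = 0.10            # average server utilization

power_to_servers = it_share * server_share                 # ~50% of facility power
useful_server_work = power_to_servers * utilization        # ~5% of facility power
storage_network = it_share * (1 - server_share)            # ~33% of facility power
generous_it_usage = storage_network + useful_server_work   # ~38% if storage/network count as useful

print(f"Power reaching servers:       {power_to_servers:.0%}")
print(f"Useful server computation:    {useful_server_work:.0%}")
print(f"Generous 'IT usage' estimate: {generous_it_usage:.0%}")

Run as written, this prints roughly 50%, 5%, and 38%, which is why both the "only a few percent" framing and the far more generous 38% figure can be defended, depending on what you count as useful IT work.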
Read more

IBM Announces Plans To Acquire Kenexa, A Talent Software And Services Company

Claire Schooley

I’m thrilled to see “people” talked about as a major focus of business. Company executives recognize that people are critical to sustainable organizational growth. Talent is now a C-level priority. People development is a responsibility of all managers and leaders, not just the HR department. Great to hear! Vendors see talent management as a hot space and are strategically lining up to meet business needs — enter IBM!

Read more

What Will The Future Of IT (And Technology) Look Like?

Tim Sheedy

At a CIO roundtable that Forrester held recently in Sydney, I presented one of my favourite slides (originally seen in a deck from my colleague Ted Schadler) about what has happened with technology since January 2007 (a little over five years ago). The slide goes like this:

[Slide: technology changes since January 2007. Source: Forrester Research, 2012]

This makes me wonder: what will the next five years hold for us? Forecasts tend to be made assuming most things remain the same – and I bet that in 2007 few people saw all of these changes coming… What unforeseen changes might we see?

  • Will the whole concept of the enterprise disappear as barriers to entry disappear across many market segments?
  • Will the next generation reject the “public persona” that is typical in the Facebook generation and perhaps return to “traditional values”?
  • How will markets respond to the aging consumer in nearly every economy?
  • How will environmental concerns play out in consumer and business technology purchases and deployments?
  • How will the changing face of cities change consumer behaviors and demands?
  • Will artificial intelligence (AI) technologies and capabilities completely redefine business?
Read more

DCIM — Updates And Trends

Richard Fichera

Only a few months after I authored Forrester’s "Market Overview: Data Center Infrastructure Management Solutions," significant changes in the market merit some additional commentary.

Vendor Drama

The major vendor drama of the “season” is the continued evolution of Schneider’s and Emerson’s DCIM product rollouts. Following Schneider’s worldwide analyst conference in Paris last week, we now have pretty good visibility into both major vendors' strategies and products. In a nutshell, we have two very large players, both with large installed bases of data center customers, and both selling a vision of an integrated, modular DCIM framework. More importantly, it appears that both vendors can deliver on this promise. That is the good news. The bad news is that their offerings overlap heavily, and for most potential customers the choice will be a difficult one. My working theory is that whichever vendor has the larger installed footprint of equipment will have an advantage, and that a lot depends on the relative execution of their field marketing and sales organizations as both companies rush to turn thousands of salespeople and partners loose on the world with these products. This will be a classic market-share play, with the smart strategy being to sacrifice margin for market share, since DCIM solutions have a high probability of pulling through services and usually involve an annuity revenue stream from support and update fees.

How Big Is The Market?

Read more