According to CRN’s article on the event, Gelsinger was quoted as saying, “We want to own corporate workloads. We all lose if they end up in these commodity public clouds. We want to extend our franchise from the private cloud into the public cloud and uniquely enable our customers with the benefits of both. Own the corporate workload now and forever.”
Forgive my frankness, Mr. Gelsinger, but you just don’t get it. Public clouds are not your enemy. And the disruption they pose to your forward revenues isn’t their capture of enterprise workloads. The battle lines you should be focusing on are those between advanced virtualization and true cloud services, and the future placement of Systems of Engagement versus Systems of Record.
You've told your ITOps team to make it happen, you've approved the purchase of cloud-in-a-box solutions, but your developers aren't using it. Why?
Forrester analyst Lauren Nelson and I get this question often in our inquiries with enterprise customers. We've found the answer and published a new report specifically on this topic.
Its core finding: Your approach is wrong.
You're asking the wrong people to build the solution. You aren't giving them clear enough direction on what they should build. You aren't helping them understand how this new service should operate or how it will affect their career and value to the organization. And more often than not you are building the private cloud without engaging the buyers who will consume this cloud.
And your approach is perfectly logical. Many of us in IT see a private cloud as an extension of our investments in virtualization. It's simply virtualization with some standardization, automation, a portal, and an image library, isn't it? Yep. And a Porsche is just a Volkswagen with a better engine, tires, suspension, and seats. That's the fallacy in this thinking.
To get private cloud right, you have to step away from the guts of the solution and start with the value proposition, seen from the point of view of the consumers of this service: your internal developers and business users.
Now that we’ve been back from the holidays for a month, I’d like to round out the 2013 predictions season with a look at the year ahead in server virtualization. If you’re like me (or this New York Times columnist), you’ll agree that a little procrastination can sometimes be a good thing to help collect and organize your plans for the year ahead. (Did you buy that rationalization?)
We’re now more than a decade into the era of widespread x86 server virtualization. Hypervisors are certainly a mature (if not peaceful) technology category, and the consolidation benefits of virtualization are now incontestable. 77% of you will be using virtualization by the end of this year, and you’re running as many as 6 out of 10 workloads in virtual machines. With such strong penetration, what’s left? In our view: plenty. It’s time to ask your virtual infrastructure, “What have you done for me lately?”
With that question in mind, I asked my colleagues on the I&O team to help me predict what the year ahead will hold. Here are the trends in 2013 you should track closely:
Consolidation savings won’t be enough to justify further virtualization. For most I&O pros, the easy workloads are already virtualized. Looking ahead at 2013, what’s left are the complex business-critical applications the business can’t run without (high-performance databases, ERP, and collaboration top the list). You won’t virtualize these to save on hardware; you’ll do it to make them mobile, so they can be moved, protected, and duplicated easily. You’ll have to explain how virtualizing these apps will make them faster, safer, and more reliable—then prove it.
It looks like EMC has finally admitted it needs a better approach to courting developers and is doing something significant to fix it. No longer will key assets like Greenplum, Pivotal, or Spring flounder in a corporate culture dominated by infrastructure thinking and selling.
After months of rumors about a possible spin-out going unaddressed, EMC pulled the trigger today, asking Terry Anderson, its VP of Corporate Communications, to put out an official acknowledgement on one of its blogs (a stealthy, investor-relations-centric move) of its plans to aggregate its cloud and big data assets and give them concentrated focus. It didn't officially announce a spin-out or even the creation of a new division. Nor did it clearly identify the role former VMware CEO Paul Maritz will play in this new gathering. But it did clarify what assets would be pushed into this new group:
So what does VMware and EMC’s announcement of the new Pivotal Initiative mean for I&O leaders? Put simply, it means the leading virtualization vendor is staying focused on the data center — and that’s good news. As many wise men have said, the best strategy comes from knowing what NOT to do. In this case, that means NOT shifting focus too fast and too far afield to the cloud.
I think this is a great move; it makes all kinds of sense to protect VMware’s relationship with its core buyer, maintain focus on the data center, and lay the foundation for the vendor’s software-defined data center strategy. This move helps to end the cloud-washing that’s confused customers for years: There’s a lot of work left to do to virtualize the entire data center stack, from compute to storage and network and apps, and the easy apps, by now, have mostly been virtualized. The remaining workloads enterprises seek to virtualize are much harder: They don’t naturally benefit from consolidation savings, they are highly performance sensitive, and they are much more complex.
On Tuesday, September 4, Microsoft made the official announcement of Windows Server 2012, ending what has seemed like an interminable sequence of rumors, beta releases, and speculation about this successor to Windows Server 2008.
So, is it worth the wait and does it live up to its hype? All omens point to a resounding “YES.”
Make no mistake, this is a major restructuring of the OS and a significant step-function in capabilities, aligned with several strategic trends for both Microsoft and the rest of the industry. While Microsoft’s high-level message is centered on the cloud, and on the Windows Server 2012 features that make it a productive platform upon which both enterprises and service providers can build a cost-effective cloud, its features will be immensely valuable to a wide range of businesses.
What It Does
The reviewer's guide for Windows Server 2012 is over 220 pages long, and the OS has at least 100 features worth noting, so a real exploration of this OS's features is well beyond what I can do here. Nonetheless, we can look at several buckets of technology to get a sense of the general capabilities. It's also important to note that while Microsoft has positioned this as a very cloud-friendly OS, almost all of its cloud-related features are also very useful in an enterprise IT environment.
New file system — Included in WS2012 is ReFS, a new file system designed to survive failures that would bring down or corrupt the previous NTFS file system (which is still available). Combined with improvements in cluster management and failover, this is a capability that will play across the entire user spectrum.
The most notable news to come out of the VMworld conference last week was the coronation of Pat Gelsinger as the new CEO of VMware. His tenure officially started over the weekend, on September 1, to be exact.
For those who don’t know Pat’s career, he gained fame at Intel as the personification of the x86 processor family. It’s unfair to pick a single person as the father of the modern x86 architecture, but if you had to pick just one person, it’s probably Pat. He then grew to become CTO, and eventually ran the Digital Enterprise Group. This group accounted for 55% of Intel’s US$37.586B in revenue according to its 2008 annual report, the last full year of Pat’s tenure. EMC poached him from Intel in 2009, naming him president of the Information Infrastructure Products group. EMC’s performance since then has been very strong, with a 17.5% YoY revenue increase in its latest annual report. Pat’s group contributed 53.7% of that revenue. While he’s a geek at heart (his early work), he proved without a doubt that he also has the business execution chops (his later work). Both will serve him well at VMware, especially the latter.
End User Computing is at the Root of the VMware Family Tree
Examine the roots of the VMware family tree, and End User Computing is the longest root of 'em all. It's where it all began, back in 1999 with a cool little product that let me run Windows on top of Linux. It was like magic for software customer demos of complex enterprise apps. I could royally screw up a demo environment an hour before a demo for a $15M deal by adding just one field to the screen that the customer demanded to see, but instead of soiling my underwear in a panic, I could go back to my most recently saved state of less than an hour before. Brilliant! It was a tool for me to be more effective in my job. Hold that thought.
So, with this heritage in mind and a general respect for VMware's products, honed over the past 15 years of growth and change into fantastic tools for I&O professionals managing virtualized environments, I was delighted to see End User Computing as the focus of general session demos and breakout sessions. I was looking forward to learning more about Wanova Mirage to see if it could help on the employee freedom and personal innovation front. Those of you following this space know what I think of what I like to call Soviet Bloc Virtual Desktop Infrastructures.
Virtuosity as the Root of Innovation and the Dangers of Hosted VDI
The long-rumored changing of the guard at VMware finally took place last week, and with it fell a stubborn strategic stance that had been a big client dissatisfier. Out went the ex-Microsoft visionary who dreamed of delivering a new "cloud OS" that would replace Windows Server as the corporate standard, and in came a pragmatic refocusing on infrastructure transformation that acknowledges the heterogeneous reality of today's data center.
Paul Maritz will move into a technology strategy role at EMC, where he can focus on how the greater EMC company can raise its relevance with developers. Clearly, EMC needs developer influence and application-level expertise, and from a stronger, full-portfolio perspective. Here, his experience can be applied more broadly -- and we expect Paul to shine in this role. However, I wouldn't look to see him re-emerge as CEO of a new spin-out of these assets. At heart, Paul is more a natural technologist, and it's not clear all these assets would move out as one anyway.
Only a few months after I authored Forrester’s "Market Overview: Data Center Infrastructure Management Solutions," significant changes already merit some additional commentary.
The major vendor drama of the “season” is the continued evolution of Schneider and Emerson’s DCIM product rollouts. Since Schneider’s worldwide analyst conference in Paris last week, we now have pretty good visibility into both major vendors' strategies and products. In a nutshell, we have two very large players, both with large installed bases of data center customers, and both selling a vision of an integrated modular DCIM framework. More importantly, it appears that both vendors can deliver on this promise. That is the good news. The bad news is that their offerings are highly overlapped, and for most potential customers the choice will be a difficult one. My working theory is that whoever has the largest footprint of equipment will have an advantage, and that a lot depends on the relative execution of their field marketing and sales organizations as both companies rush to turn thousands of salespeople and partners loose on the world with these products. This will be a classic market share play, with the smart strategy being to sacrifice margin for market share, since DCIM solutions have a high probability of pulling through services and usually involve some annuity revenue stream from support and update fees.