As someone who has been covering cloud computing since the dawn of Amazon’s Elastic Compute Cloud (EC2), I’m constantly in education mode about what is and isn’t cloud computing. To borrow an analogy from my Forrester colleague Ted Schadler’s keynote at last year’s IT Forum, the challenge is a lot like helping blind men discern an elephant through just the parts of the animal they can reach. One feels the trunk and declares it a cylindrical yet warm, hairy snake. Another, feeling its hind leg, calls it a strong, tough, deeply rooted tree. Each examiner brings their own experience, context and judgments to the challenge, then leaps to the conclusion that best fits their desires.
There’s an old adage that the worst-running car in the neighborhood belongs to the auto mechanic. Why? Because he likes to tinker with it. We IT pros love building and tinkering with things, too, and at one point we all built our own PCs, which probably ran about as well as the mechanic's car down the street.
While the mechanic’s car never ran that well, it wasn’t a reflection on the quality of his work on your car, because he drew the line between what he could tinker with and what could sink him as a professional (well, most of the time). IT pros do the same thing. We try not to tinker with systems that affect our clients or put our service-level agreements at risk. Yet there is a tinkerer’s mentality in all of us. This mentality is evidenced in our data centers, where the desire to configure our own infrastructure and build out our own best-of-breed solutions has resulted in an overly complex mishmash of technologies, products and management tools. There’s lots of history behind this mess and lots of good intentions, but nearly everyone wants a cleaner way forward.
In the vendors’ minds, this way forward is clearly one that has more of their stuff inside, and their latest thinking here is the new converged infrastructure solutions they are marketing, such as HP’s BladeSystem Matrix and IBM’s CloudBurst. Each of these products is the vendor’s vision of a cleaner, more integrated and more efficient data center, and there’s a lot of truth to that vision in what they have engineered. The big question is whether you should buy into it.
Every spring I’m faced with the wonderful opportunity – and challenge – of choosing the best questions for Forrester's annual 20-minute Web survey of commercial buyers of IT infrastructure and hardware across North America and Europe.
As technology industry strategists, what themes or hypotheses in IT infrastructure do you think we should focus on? What are the emerging topics with the potential for large, long-term consequences, such as cloud computing, that you’d like to see survey data on? Please offer your suggestions in the comments below by May 21!
This year, I’m proposing the following focus areas for the survey:
New client system deployment strategies – virtual desktops, bring-your-own-PC, Win 7, smartphones, and tablets
Hypothesis: Early adopters are embracing virtual desktops and bring-your-own-PC, but the mainstream will proceed with standard Win 7 deployments
VMware And salesforce.com Join Forces To Push PaaS To Mainstream Adoption With vmforce
salesforce.com and VMware announced today the development of a joint product and service offering named vmforce. Forrester had a chance to talk to executives at both companies prior to the announcement, and I am quite impressed by this bold move by the two players. Most corporate developers and ISVs perceive the two stacks as totally different alternatives when selecting a software platform. While the VMware stack, with its Tomcat-based Spring framework, reached mainstream popularity among Java developers through its more lightweight, standards-based Java approach, salesforce.com’s Force.com stack was mostly attractive to developers who wanted to extend packaged CRM apps with individual business logic or to ISVs creating new applications from scratch. In some cases, standard Java and Force.com's more proprietary Apex language even appeared as competing options.
Like many industries before it, IT is rapidly evolving toward an industrial model. A process or profession becomes industrialized when it matures from an art form into a widespread, repeatable function with predictable results, accelerated by technology to achieve far higher levels of productivity. Results must be deterministic (trustworthy) and execution must be fast and nimble – two related but different qualities. Customer satisfaction need not be addressed directly, because reliability and speed result in lower costs and higher satisfaction.
IT should learn from agriculture and manufacturing, which have perfected industrialization. In agriculture, productivity is orders of magnitude better than it once was. Genetic engineering made crops resistant to pests and environmental extremes such as droughts while simultaneously improving consistency. The industrialized evolution of farming means we can feed an expanding population with fewer farmers, and it has benefits in nearly every facet of agricultural production.
Manufacturing process improvements like the assembly line and just-in-time manufacturing combined with automation and statistical quality control to ensure that we can make products faster and more consistently, at a lower cost. Most of the products we use could not exist without an industrialized model.
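The statistical quality control mentioned above is worth a brief illustration: a Shewhart-style control chart establishes limits from an in-control baseline run, then flags any later measurement that drifts more than three standard deviations from the process mean. A minimal sketch (all sample data here is hypothetical):

```python
# Minimal sketch of statistical process control (Shewhart control chart):
# derive 3-sigma limits from an in-control baseline, then flag later
# measurements that fall outside them. Sample data is hypothetical.
from statistics import mean, stdev

def control_limits(baseline, sigma_limit=3.0):
    """Compute lower/upper control limits from baseline measurements."""
    mu, sd = mean(baseline), stdev(baseline)
    return mu - sigma_limit * sd, mu + sigma_limit * sd

def out_of_control(measurements, lower, upper):
    """Return the measurements that fall outside the control limits."""
    return [m for m in measurements if m < lower or m > upper]

# Hypothetical widget diameters (mm) from a stable production run.
baseline = [10.01, 9.98, 10.02, 9.99, 10.00, 10.03, 9.97, 10.01]
lower, upper = control_limits(baseline)

# New parts off the line; two are clearly defective.
new_parts = [10.02, 12.50, 9.99, 9.40]
print(out_of_control(new_parts, lower, upper))  # → [12.5, 9.4]
```

The point of the two-phase design is that limits come from a known-good run; computing limits over data that already contains defects would inflate the standard deviation and mask the very outliers you want to catch.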
Smoke and fire are all around you, the sound of the alarm makes you dizzy, and people are running in panic to escape the inferno while you have to find your way to safety. This is not a scene from the latest video game but actual training – for field engineers, for example – in an exact virtual copy of a real-world environment such as an oil platform or a manufacturing plant.
In a recent discussion with VRcontext, a Brussels-based company that has specialized in asset virtualization for 10 years, I was fascinated by the possibility of creating virtual copies of large, extremely complex real-world assets simply by scanning existing CAD plans or performing on-site laser scans. It’s not just the 3D virtualization but the integration of the virtual world with Enterprise Asset Management (EAM), ERP, LIMS, P&ID and other systems that allows users to track, identify and locate every single piece of equipment in both the real and the virtual world.
These solutions are used today for safety training simulations as well as to increase operational efficiency, for example in asset maintenance processes. There are still areas for further improvement, like the integration of RFID tags or sensor readings. However, as the technology matures I can see future use cases all over the place – from the virtualization of any location that is difficult or dangerous to enter to simple office buildings for a ‘company campus tour’ or a ‘virtual meeting’. And it doesn’t require supercomputing power: it all runs on low-spec, ‘standard’ PCs, and the models take up only a few gigabytes of storage.
So if you are bored with running around in Second Life or World of Warcraft and you ever have the chance, exchange your virtual sword for a wrench and visit the ‘real’ virtual world of a fascinating oil rig or refinery.
There are more hindrances to AMD’s ability to penetrate the market with its Opteron CPUs, and Intel’s not at fault this time. In an earlier blog post on the AMD-Intel settlement, I brought up an example of a type of incompatibility between the two CPU makers that isn’t covered by the settlement – live migration of virtual machines. There’s more to this story.
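The live-migration incompatibility comes down to CPU feature sets: a hypervisor will generally refuse to move a running VM to a host whose CPU has a different vendor ID or lacks features the VM was started with, and AMD and Intel processors differ on both counts. A minimal sketch of that pre-migration check (the vendor strings are the real CPUID values, but the flag sets and the check itself are simplified illustrations, not any hypervisor's actual code):

```python
# Simplified sketch of the compatibility check a hypervisor performs
# before live-migrating a VM. The vendor strings are real CPUID vendor
# IDs; the flag sets and logic are illustrative, not a real hypervisor's.

def can_live_migrate(source_cpu, dest_cpu):
    """A VM can move only if vendors match and no source feature is missing."""
    if source_cpu["vendor"] != dest_cpu["vendor"]:
        return False  # GenuineIntel vs. AuthenticAMD: migration refused
    # The destination must expose at least the source's feature flags.
    return source_cpu["flags"] <= dest_cpu["flags"]  # subset check

intel_host = {"vendor": "GenuineIntel", "flags": {"sse2", "sse3", "ssse3", "vmx"}}
amd_host   = {"vendor": "AuthenticAMD", "flags": {"sse2", "sse3", "sse4a", "svm"}}

print(can_live_migrate(intel_host, amd_host))    # False – different vendors
print(can_live_migrate(intel_host, intel_host))  # True
```

This is also why a mixed Intel/AMD cluster effectively splits into two migration pools: even masking features down to a common subset cannot hide the vendor difference or the vendor-specific virtualization extensions (VT-x vs. AMD-V).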
CA is a vendor that already enjoys a leading position in overall network management. Its 2005 acquisition of Concord, which brought along the assets of the previously acquired Aprisma, instantly moved CA from an also-ran to one of the clear leaders. Concord was good, and CA has an impressive track record of growing that business since the acquisition. Still, there were some weaknesses with regard to more advanced performance analysis.
On September 14, 2009, CA finally addressed these performance gaps by announcing its intent to acquire NetQoS for $200 million. Based in Austin, TX, NetQoS is one of those exciting small companies that proved there is a better approach to many of the challenges we face. It is one of the true innovators in performance management of both infrastructure and applications.
EMC continues to tease the market with its management software ambitions, taking another step this week to build on its portfolio. On May 27, EMC announced its intent to acquire Configuresoft, a vendor of server configuration and change management (CCM) software. Forrester views this as a positive development for both companies but we eagerly await more.