Hybrid clouds are especially subject to the law of unintended consequences, says Forrester’s cloud expert James Staten. Many IT organizations don’t even acknowledge that they have a hybrid cloud. The reality: If enterprises are using public cloud software-as-a-service (SaaS) and/or deploying any custom applications in the public cloud, then by definition they have a hybrid cloud, because it almost always connects to the back end.
In this episode of TechnoPolitics, James implores CIOs and IT professionals to get serious about hybrid cloud now to avoid spaghetti clouds in the future.
Over the last couple of years, I've fielded a number of inquiries from Forrester clients who are trying to decide whether their company should move their email and other collaboration workloads into the cloud via Google Apps for Business or Microsoft Office 365. This conversation has gained so much momentum that I recently did a podcast with my colleague Mike Gualtieri on the subject, will host a teleconference covering the topic on February 26, and will soon publish a report detailing answers to five of the common questions that we get about online collaboration and productivity suites (which include Office 365, Google Apps, and IBM SmartCloud for Social Business). Fueling this extended conversation are business and IT leaders' deliberations over one question: Is there a right or wrong in selecting one vendor's offering over the other? I'll use a typical analyst hedge to answer: It depends.
It's not controversial that business success today depends more than ever on IT performance. Business processes and IT operations are highly interdependent and tightly linked. Alignment between the two is no longer an option—it’s a requirement to stay competitive. Your business customers won’t succeed in today’s dynamic economy without IT behind them, but business customers care about outcomes, not technologies. The more you can think like they do, the better your relationship will be, the better your outcomes will be, and frankly, the better your future job prospects will be.
Forrester calls the evolution of IT from a provider of technologies to a broker of business services the “IT to BT (business technology) transformation.” Key to this shift is rethinking IT’s role in the enterprise and, in particular, rethinking current IT processes and the tools used to support them. Many IT organizations have improved workload, application release, run-book, data transfer, and virtual machine management processes, to name a few, through automation—yet still fail to deliver the agility and responsiveness their business customers demand.
Today’s announcements at the Open Compute Project (OCP) 2013 Summit could be considered tangible markers of the OCP crossing the line into real relevance: an important influence on emerging hyper-scale and cloud computing, with potential bleed-through into the world of enterprise data centers and computing. This is obviously a subjective viewpoint; there is no objective standard for relevance, only post facto recognition that something was important or not. But in this case I’m going to stick my neck out and predict that OCP will have real influence and will be a sticky presence in the industry for many years.
Even if their specs (which look generally quite good) do not get picked up verbatim, they will act as an influence on major vendors who will, much like the auto industry in the 1970s, get the message that there is a market for economical “low-frills” alternatives.
Major OCP Initiatives
To date, OCP has announced a number of useful hardware specifications, including:
With a couple of months' perspective, I’m pretty convinced that Intel has made a potentially disruptive entry into the market for programmable computational accelerators, often referred to as GPGPUs (general-purpose graphics processing units) in deference to the fact that the market leaders, NVIDIA and AMD, have dominated the segment with parallel computational units derived from high-end GPUs. In late 2012, Intel, referring to the architecture as MIC (Many Integrated Core), introduced the Xeon Phi product, the long-awaited productization of the development project that was known internally (and to the rest of the world as well) as Knights Ferry: a MIC coprocessor with up to 62 modified Xeon cores implemented in Intel's latest 22 nm process.
The Forrester team of Asia Pacific (AP) analysts has just published its 2013 IT industry predictions. Below is a sneak peek at some key regional trends I wanted to highlight.
2013 will be a transformative year for IT adoption in AP, as multiple IT trends converge to drive industry disruptions and help spur renewed growth in IT spending. Forrester expects IT spending in AP to rebound in 2013, with regionwide growth of 4% — rising to 8% when the large but slow-growing Japan market is excluded. While India IT spending growth will remain sluggish, the 2012 economic slowdown in China will be short-lived as government stimulus policies take effect in 2013. The Australia, New Zealand, and ASEAN markets will all remain resilient, with Vietnam, Indonesia, and the Philippines leading the way in IT spending growth.
Below are some other key predictions shaping the Asia Pacific IT industry in 2013:
End-user computing strategies will be limited to mobile device management (MDM). AP organizations are feeling the pressure to deliver applications and services across multiple devices, including traditional desktops/laptops, smartphones, and tablets. But a lack of skills will hinder bring-your-own-technology (BYOT) policies, which will remain limited to MDM, including basic device control and security/identity management.
The 2013 New Year has begun with the removal of one risk from the global tech market outlook: the US economy going over the fiscal cliff. On New Year's Day, the US House of Representatives followed the lead of the US Senate and passed a bill that extends existing tax rates for households with $450,000 or less in income, extends unemployment insurance benefits for 2 million Americans, renews tax credits for child care, college tuition, and renewable energy production, and delays the automatic spending cuts for two months. The bill also allowed Social Security payroll taxes to rise by 2 percentage points, raising the tax burden on poor and middle-class households, and it did not increase the federal debt ceiling or address entitlement spending. Still, the last-minute compromise means that the US tech market no longer has to worry, for now, about big tax increases and spending cuts pushing the US economy into recession.
Amazon Web Services (AWS) held its first global customer and partner conference, re:Invent, in late November in Las Vegas, attracting approximately 6,000 attendees. While aimed squarely at developers, AWS highlighted two key themes that will appeal directly to enterprise IT decision-makers:
Continued global expansion. AWS cites customers in 190 countries, but the company is clearly pushing for greater penetration into enterprise accounts via aggressive global expansion. AWS now has nine regions (each of which has at least one data center), including three in Asia Pacific: Tokyo, Singapore, and Sydney.
An expanded services footprint within customer accounts. The major announcement at re:Invent was a limited preview of a new data warehouse (DW) service called Amazon Redshift — a fully managed, cloud-based, petabyte-scale DW. As my colleague Stefan Ried tweeted during the event, with a limit of 1.6 petabytes, this is not just for testing and development — this is a serious production warehouse.
Forrester cloud computing expert James Staten recently published 10 Cloud Predictions For 2013 with contributions from nine other analysts, including myself. The prediction that is near and dear to my heart is #10: "Developers will awaken to: development isn't all that different in the cloud." That's right, it ain't different. Not much, anyway. Sure, it can be single-click-easy to provision infrastructure, spin up an application platform stack, and deploy your code. Cloud is great for developers. And Forrester's cloud developer survey shows that the majority of programming languages, frameworks, and development methodologies used for enterprise application development are also used in the cloud.
Forget Programming Language Charlatans
Forget the vendors and programming language charlatans who want you to think cloud development is different. You already have the skills and design sensibility to make it work. In some cases, you may have to learn some new APIs, just as you have had to for years. As James aptly points out in the post: "What's different isn't the coding but the services orientation and the need to configure the application to provide its own availability and performance. And, frankly this isn't all that new either. Developers had to worry about these aspects with websites since 2000." The best cloud vendors make your life easier, not different.
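The point generalizes: application logic written against a small service interface runs unchanged whether it is backed by a local resource or a cloud one; only the backend implementation swaps. A minimal Python sketch of that services orientation (all class and function names here are hypothetical, not from any vendor SDK):

```python
from abc import ABC, abstractmethod


class BlobStore(ABC):
    """Minimal storage interface the application codes against."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class LocalBlobStore(BlobStore):
    """In-memory backend used for local development and tests."""

    def __init__(self) -> None:
        self._blobs: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data

    def get(self, key: str) -> bytes:
        return self._blobs[key]


# A cloud backend (say, one wrapping an object-storage SDK) would implement
# the same two methods; save_report below never changes.
def save_report(store: BlobStore, name: str, body: str) -> None:
    store.put(f"reports/{name}", body.encode("utf-8"))


store = LocalBlobStore()
save_report(store, "q1.txt", "revenue up 4%")
print(store.get("reports/q1.txt").decode("utf-8"))  # revenue up 4%
```

The coding is the same either way; what the cloud adds is the discipline of talking to storage, queues, and identity as services behind interfaces like this one.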
So what does VMware and EMC’s announcement of the new Pivotal Initiative mean for I&O leaders? Put simply, it means the leading virtualization vendor is staying focused on the data center — and that’s good news. As many wise men have said, the best strategy comes from knowing what NOT to do. In this case, that means NOT shifting focus too fast and too far afield to the cloud.
I think this is a great move that makes all kinds of sense: it protects VMware’s relationship with its core buyer, maintains focus on the data center, and lays the foundation for the vendor’s software-defined data center strategy. It also helps end the cloud-washing that has confused customers for years. There’s a lot of work left to do to virtualize the entire data center stack, from compute to storage, network, and apps, and the easy apps, by now, have mostly been virtualized. The remaining workloads enterprises seek to virtualize are much harder: they don’t naturally benefit from consolidation savings, they are highly performance sensitive, and they are much more complex.