Forrester cloud computing expert James Staten recently published 10 Cloud Predictions For 2013 with contributions from nine other analysts, including myself. The prediction that is near and dear to my heart is #10: "Developers will awaken to: development isn't all that different in the cloud." That's right, it ain't different. Not much, anyway. Sure, it can be single-click-easy to provision infrastructure, spin up an application platform stack, and deploy your code. Cloud is great for developers. And Forrester's cloud developer survey shows that the majority of programming languages, frameworks, and development methodologies used for enterprise application development are also used in the cloud.
Forget Programming Language Charlatans
Forget the vendors and programming language charlatans who want you to think that cloud development is different. You already have the skills and design sensibility to make it work. In some cases, you may have to learn some new APIs, just as you have for years. As James aptly points out in the post: "What's different isn't the coding but the services orientation and the need to configure the application to provide its own availability and performance. And, frankly this isn't all that new either. Developers had to worry about these aspects with websites since 2000." The best cloud vendors make your life easier, not different.
So what does VMware and EMC’s announcement of the new Pivotal Initiative mean for I&O leaders? Put simply, it means the leading virtualization vendor is staying focused on the data center — and that’s good news. As many wise men have said, the best strategy comes from knowing what NOT to do. In this case, that means NOT shifting focus too fast and too far afield to the cloud.
I think this is a great move; it makes all kinds of sense to protect VMware’s relationship with its core buyer, maintain focus on the data center, and lay the foundation for the vendor’s software-defined data center strategy. This move helps to end the cloud-washing that’s confused customers for years: There’s a lot of work left to do to virtualize the entire data center stack, from compute to storage and network and apps, and the easy apps, by now, have mostly been virtualized. The remaining workloads enterprises seek to virtualize are much harder: They don’t naturally benefit from consolidation savings, they are highly performance sensitive, and they are much more complex.
This week Wal-Mart announced that it would put significant weight behind the new Boxee TV box, a $99 set-top box that competes with the market-leading Apple TV and the runner-up Roku boxes. Wal-Mart also sells the Apple TV and Roku devices, so it might not seem like a big deal, but it is: Wal-Mart is going to promote Boxee TV with in-store displays and outbound marketing support. Why? Because in addition to the regular apps like Hulu, Netflix, and the rest, Boxee gives Wal-Mart customers three things they can't get from Apple or Roku:
Regular TV shows from local broadcasters. Boxee's new box has a digital tuner that lets you tune to digital signals from ABC, CBS, CW, Fox, NBC, PBS, and Univision through either an over-the-air antenna or via ClearQAM.
Unlimited DVR. Not only will Boxee let you watch these channels, it is offering unlimited cloud DVR for $9.99 a month (in only the top eight markets for now) to record any shows from those networks, without managing a hard drive or paying extra if you want to store hours and hours of video.
Multidevice viewing. This is the real coup for Boxee. Because its DVR is in the cloud, it can send your recorded content to any device you log in to -- whether it's in your home or in your hands while traveling for business.
Nathan Bedford Forrest, a Confederate general of despicable ideology and consummate tactics, spoke of “keepin up the skeer,” applying continued pressure to opponents to prevent them from regrouping and counterattacking. POWER7+, the most recent version of IBM’s POWER architecture, anticipated as a follow-up to the POWER7 for almost a year, was finally announced this week, and appears to be “keepin up the skeer” in terms of its competitive potential for IBM POWER-based systems. In short, it is a hot piece of technology that will keep existing IBM users happy and should help IBM maintain its impressive momentum in the Unix systems segment.
For the chip heads, the CPU is implemented in a 32 nm process, the same as Intel’s upcoming Poulson, and embodies some interesting evolutions in high-end chip design, including:
Use of DRAM instead of SRAM — IBM has pioneered the use of embedded DRAM (eDRAM) as embedded L3 cache instead of the more standard and faster SRAM. In exchange for the loss of speed, eDRAM requires fewer transistors and lower power, allowing IBM to pack a total of 80 MB (a lot) of shared L3 cache, far more than any other product has ever sported.
[For some reason this has been unpublished since April — so here it is well after AMD announced its next spin of the SeaMicro product.]
At its recent financial analyst day, AMD indicated that it intended to differentiate itself by creating products with an advantage in niche markets, servers among them, and to generally shake up the trench warfare that has had it on the losing side of its lifelong battle with Intel (my interpretation, not AMD management’s words). Today, at least for the server side of the business, it made a move that can potentially give it visibility and differentiation: acquiring innovative server startup SeaMicro.
SeaMicro has attracted our attention since its appearance (blog post 1, blog post 2) with its innovative architecture, which dramatically reduces power and improves density by sharing components like I/O adapters, disks, and even BIOS over a proprietary fabric. The irony here is that SeaMicro came to market in tight alignment with Intel, which at one point even introduced a special dual-core packaging of its Atom CPU to allow SeaMicro to improve its density and power efficiency. Most recently, SeaMicro and Intel announced a new model featuring Xeon CPUs to address the more mainstream segments that were not a part of SeaMicro’s original Atom-based offering.
On Tuesday, September 4, Microsoft made the official announcement of Windows Server 2012, ending what has seemed like an interminable sequence of rumors, beta releases, and speculation about this successor to Windows Server 2008.
So, is it worth the wait and does it live up to its hype? All omens point to a resounding “YES.”
Make no mistake, this is a really major restructuring of the OS, and a major step-function in capabilities aligned with several major strategic trends for both Microsoft and the rest of the industry. While Microsoft’s high-level message is centered on the cloud, and on the Windows Server 2012 features that make it a productive platform upon which both enterprises and service providers can build a cost-effective cloud, its features will be immensely valuable to a wide range of businesses.
What It Does
The reviewer’s guide for Windows Server 2012 is over 220 pages long, and the OS has at least 100 features that are worth noting, so a real exploration of the features of this OS is way beyond what I can do here. Nonetheless, we can look at several buckets of technology to get an understanding of the general capabilities. Also important to note is that while Microsoft has positioned this as a very cloud-friendly OS, almost all of these cloud-related features are also very useful in an enterprise IT environment.
New file system — Included in WS2012 is ReFS, a new file system designed to survive failures that would bring down or corrupt the previous NTFS file system (which is still available). Combined with improvements in cluster management and failover, this is a capability that will play across the entire user spectrum.
I’ve been speaking to more and more clients lately who are not just saving money with cloud computing — they’re using the principles of the cloud to completely transform how they source, build, and deliver all IT services. Savvy I&O leaders should look beyond the per-hour savings promised by the cloud to the core tenets of cloud computing itself. How do the public clouds do it? Why can’t you?
Well, you can. You can transform your IT operating model from that of widget-provider to a true service-oriented business partner. Forrester writes extensively about how to make the IT to BT (business technology) transition. I recently spoke at length with the IT management team at Commonwealth Bank of Australia (CBA) about their multi-year IT transformation to what they call “everything-as-a-service.” I was put in touch with them by one of their primary suppliers, cloud service management and automation vendor ServiceMesh.
We’ll be publishing a complete case study soon, but I wanted to share some of the basics here because they outline a strategy anyone can achieve, regardless of your current level of cloud maturity. The bank started by establishing six core tenets to be enforced across all I&O services moving forward, whether hosted internally or externally. These guiding principles neatly summarize the core value dimensions of cloud computing itself:
Pay as you go. Business customers only pay for products and services actually used, on a metered, charge-back basis, under flexible service agreements, as opposed to fixed-term contracts.
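To make the tenet concrete, here is a minimal sketch of what metered, pay-as-you-go chargeback looks like in practice. The rate card, service names, and usage figures below are entirely hypothetical, invented for illustration; they are not CBA's actual billing model.

```python
# Hypothetical internal rate card: each metered unit has a price.
# (Rates are illustrative, not from any real provider or the CBA case.)
RATES = {
    "vm_hour": 0.08,           # $ per VM-hour of compute
    "storage_gb_month": 0.10,  # $ per GB-month of storage
    "db_query_k": 0.02,        # $ per 1,000 database queries
}

def monthly_chargeback(usage):
    """Bill a business unit only for the metered units it actually
    consumed, rather than a fixed-term contract amount."""
    return sum(RATES[item] * qty for item, qty in usage.items())

# Example: one business unit's metered consumption for the month.
marketing = {"vm_hour": 2160, "storage_gb_month": 500, "db_query_k": 1200}
print(f"Chargeback this month: ${monthly_chargeback(marketing):,.2f}")
```

The point of the sketch is the shape of the model, not the numbers: the business customer's bill rises and falls with actual consumption, which is what distinguishes this tenet from traditional fixed-term sourcing.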
With VMworld in full swing this week and Microsoft’s cloud-centered Windows Server 2012 launching soon after, your options for technology to build and deploy enterprise clouds are about to expand significantly. Meanwhile, Amazon continues to drop prices faster than your local Wal-Mart, introduces new cloud compute and storage services almost monthly, and has already gobbled up a trillion objects in S3. Is it time to start moving your workloads to the cloud?
Forrsights surveys show that companies are indeed moving to the cloud, primarily for speed and lower costs — but are the savings really there? The answer might not be obvious. Are you heavily virtualized already? Have you moved up the virtualization value chain beyond server consolidation to using virtual machines for better disaster recovery, less downtime, automated configuration management, and the like? Do you have a virtual-first policy and actively share resources across business units? If you run a mature virtual environment today, your internal infrastructure costs might already be competitive with the cloud.
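A back-of-the-envelope comparison shows why the answer hinges on virtualization maturity. All figures in this sketch are hypothetical and for illustration only; plug in your own host costs, consolidation ratios, and cloud pricing.

```python
# Compare the amortized cost of a VM-hour on owned, virtualized
# hardware against an on-demand cloud instance price.
# All numbers below are illustrative assumptions, not real quotes.

def internal_vm_hourly_cost(host_cost_per_month, vms_per_host, utilization):
    """Amortized cost per *used* VM-hour on internal infrastructure.
    utilization: fraction of provisioned VM capacity actually in use."""
    hours_per_month = 730  # average hours in a month
    return host_cost_per_month / (vms_per_host * hours_per_month * utilization)

CLOUD_HOURLY = 0.12  # hypothetical on-demand instance price, $/hour

# A $1,500/month fully loaded host running 20 VMs, at rising utilization:
for util in (0.3, 0.6, 0.9):
    internal = internal_vm_hourly_cost(1500, 20, util)
    winner = "internal" if internal < CLOUD_HOURLY else "cloud"
    print(f"At {util:.0%} utilization: ${internal:.3f}/VM-hour -> {winner} wins")
```

Under these assumed numbers, the lightly utilized shop pays more per useful VM-hour than the cloud price, while the mature, highly utilized environment undercuts it, which is exactly the pattern the survey data suggests.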
Cloud Services Offer New Opportunities For Big Data Solutions
What’s better than writing about one hot topic? Well, writing about two hot topics in one blog post — and here you go:
The State Of BI In The Cloud
Over the past few years, business intelligence (BI) was the overlooked stepchild of cloud solutions and market adoption. Sure, some BI software-as-a-service (SaaS) vendors have been pretty successful in this space, but it was success in a niche compared with the four main SaaS applications: customer relationship management (CRM), collaboration, human capital management (HCM), and eProcurement. While those four applications each reached cloud adoption of 25% or more in North America and Western Europe, BI led the field of second-tier SaaS solutions, used by 17% of all companies in our Forrester Software Survey, Q4 2011. Considering that the main challenges of cloud computing are data security and integration effort (yes, the story of simply swiping your credit card to get a fully operational cloud solution in place is a fairy tale), 17% cloud adoption is actually not bad at all; BI is all about data integration, data analysis, and security. With BI there is, of course, the flexibility to choose which data a company wants to run in a cloud deployment and which data sources to integrate — a choice that is very limited when implementing, e.g., a CRM or eProcurement cloud solution.
“38% of all companies are planning a BI SaaS project before the end of 2013.”
In November 2011, Atos and Yonyou (formerly Ufida) announced the creation of a joint venture dubbed Yunano™ aimed at the European SMB market. The two companies are at it again, this time focusing specifically on the Chinese domestic market. I recently met with Herbie Leung, CEO of Atos in Asia Pacific, to discuss the partnership and future market opportunities in China. This new agreement essentially covers three areas of collaboration:
Bringing PLM and MES expertise to Yonyou customers. With more than 1.5 million customers, Yonyou is one of the largest software providers in China with strengths in ERP and CRM solutions. However, the company lacks capabilities in adjacent areas like product lifecycle management (PLM) and manufacturing execution systems (MES). Following the SIS acquisition, Atos has significantly strengthened its capabilities in these domains and will offer them to Yonyou clients.
Helping Yonyou’s customers migrate to private cloud architectures. The lack of private cloud technical skills in China led Yonyou to leverage Atos’s expertise to develop private cloud assessment workshops and ERP migration services targeting the China market. Atos will in turn leverage Canopy, a company it recently created in partnership with EMC and VMware to provide cloud solutions to its clients globally.
Helping Yonyou expand into new markets in Asia. Like many Chinese companies, Yonyou has global aspirations. While the Yunano joint venture focuses on bringing Yonyou’s ERP solutions to the mid-market in EMEA, the new partnership will leverage Atos’s go-to-market capabilities to take Yonyou’s solutions to other markets in Asia.