HP today announced the Moonshot 1500 server, the first official volume product in its Project Moonshot server family (the initial Redstone, a Calxeda ARM-based server, was available only in limited quantities as a development system). It represents both a significant product today and a major stake in the ground for future products, both from HP and eventually from competitors. Its initial attraction – an extreme-density, low-power x86 server platform for a variety of low-to-midrange CPU workloads – hides the fact that it is probably a blueprint both for a family of future HP products and for similar products from other vendors.
Geek Stuff – What was Announced
The Moonshot 1500 is a 4.3U enclosure that can contain up to 45 plug-in server cartridges, each a complete server node with a dual-core Intel Atom 1200 CPU, up to 8 GB of memory, and a single disk or SSD of up to 1 TB; the servers share common power supplies and cooling. But beyond the density, the real attraction of the MS1500 (my acronym, not an official HP label) is its scalable fabric and CPU-agnostic architecture. Embedded in the chassis are multiple fabrics for storage, management, and networking, giving the MS1500 some of the advantages of a blade server, though without the advanced management capabilities. At initial shipment, only the network and management fabrics will be enabled by the system firmware, with each chassis having up to two Gb Ethernet switches (technically it can be configured with one, but nobody will do so), allowing the 45 servers to share uplinks to the enterprise network.
Digital capability – social, mobile, cloud, data & analytics – disrupts business models, introduces new competitive threats, and places new demands on your business. Highlighting this fact: Forrester’s 2012 “Digital Readiness Assessment” survey found that 65% of global executives say they are “excited about the changes that digital tools and experiences will bring” to their company.
While most people know these digital trends are coming, far fewer know how to purchase these cutting-edge digital capabilities. What companies will you rely on? Where are the new risks? What are the pricing models? In the survey mentioned above, only 32% of the same sample agreed that their organization “has policies and business practices in place to adapt” to those digital changes.
This is important, since developing the breadth of digital capabilities your company needs cannot all be done in-house. To succeed, your company will need to access the strengths of its supplier ecosystem, maximize value from strategic partners, and leverage emerging supplier models.
This is a tremendous opportunity for sourcing and vendor management professionals to increase the strategic value they provide to their business. But to do this, you’ll need to balance your traditional cost-cutting goals with business expectations for growth, innovation, and value.
A year and a half ago I broke up with Blackberry and started dating iPhone. It was a clean but cruel breakup: AT&T cancelled my T-Mobile contract on my behalf, the equivalent of getting dumped by your girlfriend’s new boyfriend.
This year I’ve been cheating on my laptop with my iPad. But it’s an on-again, off-again relationship. While I tell my iPad it’s the only one, I keep going back to my laptop. When I travel, my iPad is with me meeting clients. Meanwhile my laptop is in the hotel room surfing the online menu for a turkey club.
The iPad beats my laptop on size, weight, connectivity, and battery life. It also improves the human element when I’m having a face-to-face conversation but need to take notes. These are all critically important to me when I'm out of the office visiting clients or at an event.
But my laptop wins when I need to perform other important activities. For example, the larger screen really helps when writing and editing research reports (John Rakowski, you’ll have your edits soon!). Or when I need to approve expenses behind the VPN or access files on my hard drive that I haven’t stored in Google Drive (yes, Forrester sanctioned).
Now that I've had a few months to compare both devices, I come back to outcomes . . .
For the vast majority of Forrester customers whom I have not had the pleasure of meeting: my name is Henry Baltazar, and I'm the new analyst covering storage for the I&O team. I've covered the storage industry for over 15 years and spent the first nine years of my career as a technical analyst at eWEEK/PCWeek Labs, where I was responsible for benchmarking storage systems, servers, and network operating systems.
During my lab days, I tested hundreds of different products and was fortunate to witness the development and maturation of a number of key innovations such as data deduplication, WAN optimization, and scale-out storage. In the technology space, "Better, Faster, Cheaper - Pick Two" used to be the design goal for many innovators, and I've seen many technologies struggle to attain two, let alone three, of these goals, especially in the first few product iterations. For example, iSCSI successfully challenged Fibre Channel on the basis of being cheaper, but despite being around for over a decade, it has yet to convince many storage professionals that it is faster or better.
Looking at storage technologies today, relative to processors and networking, storage has not held up its end of the bargain. Storage needs to improve in all three vectors to either push innovation forward, or avoid being viewed as a bottleneck in the infrastructure. At Forrester I will be looking at a number of areas of innovation which should drive enterprise storage capabilities to new heights including:
In 2011, my colleague James Staten and I published two lightweight vendor assessments of the private cloud and public cloud markets. These solutions sit at the extremes of the IaaS market. To kick off 2013, I published a full vendor evaluation of a market that sits in between these two IaaS deployment types — hosted private cloud. Forrester's Forrsights Hardware Survey, Q3 2012 showed that 46% of enterprises are prioritizing investments in private clouds in 2013. While slightly more than half plan to build a private cloud in their own data center, more than 25% said they prefer to rent one. Hosted private cloud opens the door to a variety of benefits: 1) You reach cloud from day one. 2) Compute is dedicated, isolated from other clients. 3) It can enable future hybrid scenarios. 4) Licensing and compliance requirements are easier to meet. 5) Setup of the cloud and management of the infrastructure are outsourced, letting you focus on support and utilization.
Overall this report revealed no leaders, but it did show some strengths and weaknesses across the market and provide a framework and sample criteria for assessing vendors in this space. This research process also revealed some unexpected nuances within this space:
Hosted private cloud and virtual private cloud are often used interchangeably within the market — despite being distinct deployment types.
Level and method of dedication varies greatly by solution.
Layers managed differ greatly by solution.
Although agility is a benefit, few solutions enable self-service access to resources for their end users. Ticket-based request systems are common.
Many enterprises are using hosted private cloud for some unexpected advantages:
Hybrid clouds are especially subject to the law of unintended consequences, says Forrester’s cloud expert James Staten. Many IT organizations don’t even acknowledge that they have a hybrid cloud. The reality: If enterprises are using public cloud software-as-a-service (SaaS) and/or deploying any custom applications in the public cloud, then by definition they have a hybrid cloud, because it almost always connects to the back end.
In this episode of TechnoPolitics, James implores CIOs and IT professionals to get serious about hybrid cloud now to avoid spaghetti clouds in the future.
Over the last couple of years, I've fielded a number of inquiries from Forrester clients who are trying to decide whether their company should move their email and other collaboration workloads into the cloud via Google Apps for Business or Microsoft Office 365. This conversation has gained so much momentum that I recently did a podcast with my colleague Mike Gualtieri on the subject, will host a teleconference covering the topic on February 26, and will soon publish a report detailing answers to five of the common questions that we get about online collaboration and productivity suites (which include Office 365, Google Apps, and IBM SmartCloud for Social Business). Fueling this extended conversation are business and IT leaders' deliberations over one question: Is there a right or wrong in selecting one vendor's offering over the other? I'll use a typical analyst hedge to answer: It depends.
It's not controversial that business success today depends more than ever on IT performance. Business processes and IT operations are highly interdependent and tightly linked. Alignment between the two is no longer an option—it’s a requirement to stay competitive. Your business customers won’t succeed in today’s dynamic economy without IT behind them, but business customers care about outcomes, not technologies. The more you can think like they do, the better your relationship will be, the better your outcomes will be, and frankly, the better your future job prospects will be.
Forrester calls the evolution of IT from a provider of technologies to a broker of business services the “IT to BT (business technology) transformation.” Key to this shift is rethinking IT’s role in the enterprise and, in particular, rethinking current IT processes and the tools used to support them. Many IT organizations have improved workload, application release, run-book, data transfer, and virtual machine management processes, to name a few, through automation—yet still fail to deliver the agility and responsiveness their business customers demand.
Today’s announcements at the Open Compute Project (OCP) 2013 Summit could be considered tangible markers of the OCP crossing the line into real relevance: an important influence on emerging hyper-scale and cloud computing, with potential bleed-through into the world of enterprise data centers and computing. This is obviously a subjective viewpoint – there is no objective standard for relevance, only after-the-fact recognition that something was or wasn’t important. But in this case I’m going to stick my neck out and predict that OCP will have real influence and will be a sticky presence in the industry for many years.
Even if their specs (which look generally quite good) do not get picked up verbatim, they will act as an influence on major vendors who will, much like the auto industry in the 1970s, get the message that there is a market for economical “low-frills” alternatives.
Major OCP Initiatives
To date, OCP has announced a number of useful hardware specifications, including:
With a couple of months' perspective, I’m pretty convinced that Intel has made a potentially disruptive entry into the market for programmable computational accelerators, often referred to as GPGPUs (general-purpose graphics processing units) in deference to the fact that the market leaders, NVIDIA and AMD, have dominated the segment with parallel computational units derived from high-end GPUs. In late 2012, Intel, referring to the architecture as MIC (Many Integrated Core), introduced the Xeon Phi product, the long-awaited productization of the development project known internally (and to the rest of the world as well) as Knights Ferry: a MIC coprocessor with up to 61 simplified x86 cores implemented in Intel's latest 22 nm process.