Big Iron Lives — Huawei Shows Off KunLun 32S x86 Server

Richard Fichera

I was recently at an event that Huawei hosted in Latin America for its telecom carrier community, at which Huawei showed off an impressive range of carrier-related technology, including distributed data center management, advanced analytics, and a heavy emphasis on compute and storage in addition to its traditionally strong core carrier technology. Interestingly, Huawei chose this venue for the Latin America unveiling of the KunLun server, an impressive bit of engineering that clearly shows innovation in big-iron x86 servers is not dead. There is some confusion about whether the March announcement at CeBIT constituted the official unveiling of the actual machine, but Huawei had a real system on the floor at this event and claimed it was the first public showing of the actual system.

The KunLun server, named after a mountain range in Qinghai Province, places Huawei squarely up against the highest-end servers from HPE, IBM, Oracle, NEC, and Fujitsu, with a list of very advanced RAS features, including memory migration, hot memory and CPU swap, predictive failure diagnostics, and a host of others, some enabled by the underlying Xeon E7 technology and others added by Huawei through its custom node controller architecture (essentially a standard feature of all large x86 servers). Partitionable into smaller logical servers, the KunLun can serve as a core transaction processor for extreme workloads or as a collection of tightly coupled but electrically and logically isolated servers.

So why unveil this high-end engine at a telecom carrier show? My read is that the carriers will be at the center of much of the IoT action, and the data streams they process will need an ever-expanding inventory of processing capacity, so this is a pretty good venue. Plus, it reinforces the emerging primacy of analytics, especially in-memory analytics, which the KunLun can address extremely well with its current 24TB (32GB DIMMs) of DRAM.

Read more

Microsoft Fights Back In Education

JP Gownder

Today, Microsoft's Terry Myerson announced the new strategy for Windows in the classroom. Windows 10 -- which is now Windows-as-a-service, with periodic updates delivered from the cloud -- will see a big feature update this summer with the Windows Anniversary Update, announced a few weeks ago at the BUILD developer conference. Now we're learning about the education-specific features that will take on Chromebooks.

It's no secret that Google's Chromebooks have taken the education market by storm; they now constitute more than half of shipments of new devices sold to U.S. schools. Some schools are even re-imaging old Windows PCs into Chromebooks. As a result, both Apple and Microsoft have seen their positions in the educational market slide south over the past four years.

Why does this matter? Well, for the obvious device sales implications, of course. But it's part of a longer-term customer relationship issue, too: If young people grow up not knowing Windows, will they ever care about the platform? Tomorrow's Windows customers could be shaped in today's classroom... or tomorrow's Chromebook customers could be.

For schools, Windows Anniversary Update will address key issues in education:

Read more

Oculus’ Botched Launch Harms The VR Ecosystem

JP Gownder

April 12, 2016: The day Oculus updated its Rift shipment timeframe for customers. As has been widely reported, Oculus customers face widespread months-long delays in the deliveries of their virtual reality headset purchases. To add a personal anecdote, I ordered within the first 5 minutes of the pre-launch window (once the web site started working, which it didn’t at first), and my Rift shipment has been delayed from March 30th to “between May 9 and 19th,” assuming Oculus actually succeeds in meeting its new dates.

While my personal Rift delay is merely an annoyance, the botched launch has real repercussions for the VR ecosystem. Oculus’ delay:

  • Hurts developers of games and apps. The diversity and depth of the VR developer ecosystem is impressive. While many developers focus on games – logically enough, since that’s a key early adopter demographic – others offer applications ranging from clinical treatments for PTSD to collaboration in virtual spaces. The common denominator? None of these developers are making money if there are no headsets available. And while many apps can be ported to other platforms, Oculus has been the centerpiece of many developers’ high-end VR efforts.
  • Hurts media startups and innovations. Media, too, sees a potential loss. While some media companies go the route of the New York Times and focus on Google Cardboard phone-based VR, others are counting on developing truly immersive experiences that simulate presence. The studio Jaunt VR has an Oculus app that, again, won’t reach an audience until customers receive their Rifts.
Read more

Why High Performance People Need High Performance Technology

David Johnson

My colleagues and I have spent the last four years studying the links between technology, human performance at work, customer experience, and the financial performance of companies. One fascinating insight we’ve learned is that what separates the highest performing people in their work from others is their ability to reliably focus their attention, bringing more of their cognitive resources to bear on their work each day than their colleagues do. It’s not easy in our distraction-rich, techno-charged world.

There’s plenty of research showing that happy employees are more productive, but Drs. Teresa Amabile and Steven J. Kramer made an important discovery in 2010 that turns conventional wisdom about where happiness at work comes from upside down. The most powerful source of happiness at work isn’t money, free food, or recognition, but rather getting things done: making progress every day toward work that we know is important. The more conducive our work environment is to staying focused, and the better we are at suppressing the sources of distraction within ourselves to get our most important work done, the happier we will be at work. And the effect is even stronger for work that requires creativity and problem-solving skills.

Unfortunately, in our workforce technology research, technology distraction isn't on the list of things leaders are concerned about. It should be, because the most pernicious sources of distraction employees face are the ones that lie beyond their control: the distractions that originate from the technologies their employers require them to use when there are no alternatives.

Read more

Reality Check: Virtual Reality Isn’t A Real Market. Yet.

JP Gownder

You’re probably hearing an endless stream of excessive, short-term virtual reality (VR) hype. For example, at SXSW 2016, a great deal of time and energy is being devoted to VR experiments, new media announcements, and demonstrations.

The reality? The vast majority of consumers aren’t there yet, don’t know or care about VR, and won’t know or care in 2016 unless they are hardcore gamers. And only a few forward-looking enterprises – digital predators – are experimenting with VR in effective ways today.

At Forrester, we believe that VR will find its place in the pantheon of important computing platforms, eventually reshaping the way workers work, enterprises interact with customers, and consumers perform a variety of tasks. In other words, it's going to be a real market... at some point.

Too many clients think that VR is a platform that they simply must address in 2016. We think that’s premature. Even in the era of hyperadoption, VR must overcome key obstacles to gain mass market status:

  • Need for market education. Most consumers don’t have a deep understanding of VR, nor is there an easy venue for them to learn about it. Retailing represents a challenge: Buyers must experiment with a headset for many minutes to even get a sense of what the technology does. In past technology revolutions (smartphones, tablets), the Apple Store played this role… but Apple isn’t in the VR game (yet).
Read more

Software-Defined Data Center, Coming Ready Or Not!

Robert Stroud

As we embark on the era of “cloud first” being business as usual for operations, one of the acronyms flying around the industry is SDDC, or the Software-Defined Data Center. The term, very familiar to me since starting with Forrester less than six months ago, has become an increasingly frequent topic of conversation with Forrester clients and vendors alike. It is germane to my first Forrester report, “Infrastructure As Code, The Missing Element In The I&O Agenda,” where I discuss the changing role of I&O pros from building and managing physical hardware to abstracting configurations as code. The natural extension of this is the SDDC.

We believe that the SDDC is an evolving architectural and operational philosophy rather than simply a product that you purchase. It is rooted in a series of fundamental architectural constructs: modular, standards-based infrastructure; virtualization at all layers; and complete orchestration and automation.

The Forrester definition of the SDDC is:

An SDDC is an integrated abstraction model that defines a complete data center by means of a layer of software that presents the resources of the data center as pools of virtual and physical resources and allows them to be composed into arbitrary user-defined services.
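
To make that definition concrete, here is a minimal Python sketch of the idea (the names and objects are illustrative assumptions, not any vendor's API): the data center is presented as pools of resources, and a layer of software composes them into user-defined services.

```python
# Hypothetical sketch of the SDDC abstraction: pooled capacity composed
# into services entirely in software. Names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class ResourcePool:
    """A pool of fungible data center capacity (e.g., vCPUs or GB of RAM)."""
    name: str
    capacity: int

    def allocate(self, amount: int) -> int:
        """Carve capacity out of the pool; fail if the pool is exhausted."""
        if amount > self.capacity:
            raise RuntimeError(f"{self.name} pool exhausted")
        self.capacity -= amount
        return amount

@dataclass
class Service:
    """An arbitrary user-defined service composed from pooled resources."""
    name: str
    allocations: dict = field(default_factory=dict)

def compose_service(name: str, pools: dict, requirements: dict) -> Service:
    """Compose a service purely in software by allocating from each pool."""
    svc = Service(name)
    for resource, amount in requirements.items():
        svc.allocations[resource] = pools[resource].allocate(amount)
    return svc

# The "data center" presented as pools of virtual and physical resources.
pools = {
    "vcpu": ResourcePool("vcpu", 512),
    "ram_gb": ResourcePool("ram_gb", 4096),
    "disk_gb": ResourcePool("disk_gb", 100_000),
}

# A user-defined service carved out of those pools, with no manual wiring.
web_tier = compose_service("web-tier", pools, {"vcpu": 16, "ram_gb": 64, "disk_gb": 500})
print(web_tier)
```

In a real SDDC, the pools would span compute, storage, and network, and the composition would be driven by orchestration and automation tooling rather than direct calls.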

Read more

Infrastructure As Code, The Missing Element In The I&O Agenda

Robert Stroud

For many years, infrastructure and operations (I&O) professionals have been dedicated to delivering services at lower costs and ever greater efficiency, but the business technology (BT) agenda requires innovation that delivers top-line growth.

The evolution and success of digital business models are leading I&O organizations to disrupt their traditional infrastructure models and pursue cloud strategies and new infrastructure architectures and mindsets that closely resemble cloud models.

Such a cloud-first strategy supports the business agenda for agility, rapid innovation, and delivery of solutions. This drives customer acquisition and retention and extends the focus beyond ad hoc projects to the complete technology stack. The transition to cloud-first mandates that infrastructure delivery, management, and maintenance evolve to support infrastructure's delivery and consumption as a reusable software component. Such infrastructure can be virtual or physical and can be consumed as required, without lengthy build and deployment cycles.

Growing cloud maturity, the move of systems of record to the cloud (see my blog “Driving Systems of Records to the Cloud, your focus for 2016!”), container growth, extensive automation, and the availability of "infrastructure as code" change the roles within I&O, as far less traditional administration is needed. I&O must transition from investing in traditional administration to the design, selection, and management of the tooling it needs for composable infrastructure.
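
To make the "infrastructure as code" idea concrete, here is a hypothetical Python sketch (not any real tool's API): the desired infrastructure is captured as versionable data, and a plan/apply loop converges the actual environment toward it, replacing manual administration.

```python
# Hypothetical sketch: infrastructure captured as declarative, versionable data.
# A real provisioning tool would call cloud or hypervisor APIs in apply();
# here we simply mutate a dict to illustrate the workflow.

desired_state = {
    "app-server": {"instances": 4, "image": "web:1.9", "ram_gb": 8},
    "job-runner": {"instances": 2, "image": "batch:2.3", "ram_gb": 16},
}

actual_state = {
    "app-server": {"instances": 2, "image": "web:1.8", "ram_gb": 8},
}

def plan(desired, actual):
    """Diff desired vs. actual state and return the changes needed to converge."""
    changes = []
    for name, spec in desired.items():
        if name not in actual:
            changes.append(("create", name, spec))
        elif actual[name] != spec:
            changes.append(("update", name, spec))
    for name in actual:
        if name not in desired:
            changes.append(("destroy", name, None))
    return changes

def apply(changes, actual):
    """Execute the plan; a real tool would drive infrastructure APIs here."""
    for action, name, spec in changes:
        if action == "destroy":
            actual.pop(name)
        else:
            actual[name] = spec
        print(f"{action}: {name}")

apply(plan(desired_state, actual_state), actual_state)
assert actual_state == desired_state  # converged; re-running is a no-op
```

Because the loop only acts on differences, re-running it changes nothing, which is what lets such definitions live safely in source control and automated pipelines.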

Read more

Azure Stack Preview – Microsoft’s End-Game for On-Premise IT?

Richard Fichera

What’s Happening?

In 2014 I wrote about Microsoft and Dell’s joint Cloud Platform System offering, Microsoft’s initial foray into an “Azure-Like” experience in the enterprise data center. While not a complete or totally transparent Azure experience, it was a definite stake in the ground around Microsoft’s intentions to provide enterprise Azure with hybrid on-premise and public cloud (Azure) interoperability.

I got it wrong about other partners – as far as I know, Dell is the only hardware partner to offer Microsoft CPS – but it looks like my idiot-proof guess that CPS was a stepping stone toward a true on-premise Azure was correct, with Microsoft today announcing a technology preview of Azure Stack, the first iteration of a true enterprise Azure offering with hybrid on-prem and public cloud interoperability.

Azure Stack is in some ways a parallel offering to the existing Windows Server/System Center and Azure Pack offerings, and I believe it represents Microsoft’s long-term vision for enterprise IT, although Microsoft will do nothing to compromise the millions of legacy environments that want to incrementally enhance their Windows environments. But for those looking to embrace a more complete cloud experience, Azure Stack is just what the doctor ordered: an Azure environment that can run in the enterprise with seamless access to the immense Azure public cloud environment.

On the partner front, this time Microsoft will be introducing Azure Stack as pure software that can run on one or more standard x86 servers, with no special integration required, although I’m sure there will be many bundled offerings of Azure Stack and integration services from partners.

Read more

Announcing the Forrester Wave: Private Cloud Software Suites, Q1 2016

Lauren Nelson

Anyone familiar with Forrester knows the Wave drill: 1) we take the top vendors in a space, 2) run a massive data collection process, 3) evaluate each vendor, and 4) share our findings. We've been evaluating this market since Q2 2011. For our 2016 evaluation, we used 40 criteria to evaluate the following vendors: BMC, Cisco, Citrix, HPE, Huawei, IBM, Microsoft, Red Hat, and VMware.

Read more

Refinance And Refocus: Verizon, CenturyLink and Windstream Enter 2016 With Leaner Strategies

Sophia Vargas

While mergers and acquisitions have proliferated in the colocation industry – each positioned to increase geographic coverage or higher-order capabilities – a new trend has emerged in the last six months: strategic divestitures, most prominently observed in the telecommunications space. To trace the complete cycle: in 2010 and 2011, CenturyLink, Verizon, and Windstream made strategic acquisitions to increase their data center services portfolios, acquiring Savvis, Terremark, and Hosting Solutions, respectively. Five years later, each firm has announced its intent to sell off some or all of these assets.

So, what went wrong?

While telcos arguably gave birth to colocation, the fact remains that network and carrier providers have had trouble competing against pure-play colocation and data center service providers like Equinix and Digital Realty. In the past, telecom providers described colocation and data center services as a way to enrich existing customer contracts. In an interesting twist, these intended divestitures have been presented as a way to refinance core assets, focus on what drives the business, and move away from standardized services with high overhead and lower margins. While vendors may keep their skeletons in the closet, here is some speculation as to what might be fueling these decisions:

  • Buyers want carrier density and diversity. Even though all of these facilities support multiple connections into other carriers, customers tend to evaluate facilities by connectivity options instead of looking to carriers to provide data center capacity on top of network services. Additionally, many geographically dispersed companies are considering blended IP solutions to improve latency and performance across the globe.

Read more