Linux vs Unix Hot Patching – Have We Reached The Tipping Point?

Richard Fichera

The Background – Linux as a Fast Follower and the Need for Hot Patching

No doubt about it, Linux has made impressive strides in the last 15 years, gaining many features previously associated with high-end proprietary Unix as it made the transition from small system plaything to core enterprise processing resource and the engine of the extended web as we know it. Along the way it gained reliable and highly scalable schedulers, a multiplicity of efficient and scalable file systems, advanced RAS features, its own embedded virtualization, and efficient thread support.

As Linux grew, so did supporting hardware, particularly the capabilities of the ubiquitous x86 CPU upon which the vast majority of Linux runs today. The debate, though, has always been about how close Linux could get to "the real OS," the core proprietary Unix variants that for two decades defined the limits of non-mainframe scalability and reliability. But "the times they are a-changin'," and the new narrative may be "when will Unix catch up to Linux on critical RAS features like hot patching?"

Hot patching, the ability to apply updates to the OS kernel while it is running, is a long sought-after but elusive feature of a production OS. Long sought after because both developers and operations teams recognize that bringing down an OS instance that is doing critical high-volume work is at best disruptive and at worst a logistical nightmare, and elusive because it is incredibly difficult. There have been several failed attempts, and several implementations that "almost worked" but were so fraught with exceptions that they were not really useful in production.
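For a sense of what this looks like in practice: mainline Linux kernels (4.0 and later) expose loaded live patches through a sysfs interface under /sys/kernel/livepatch, with one directory per patch and an `enabled` file holding 0 or 1. A minimal sketch of an inventory check (the helper function below is illustrative, not part of any vendor tool; the `base` parameter exists only so the logic can be exercised against a test directory):

```python
import os

def list_livepatches(base="/sys/kernel/livepatch"):
    """Return {patch_name: enabled} for each loaded kernel live patch.

    The livepatch sysfs interface exposes one directory per loaded
    patch; each contains an 'enabled' file holding 0 or 1.
    """
    patches = {}
    if not os.path.isdir(base):
        # Kernel built without CONFIG_LIVEPATCH, or no patches loaded.
        return patches
    for name in sorted(os.listdir(base)):
        enabled_file = os.path.join(base, name, "enabled")
        if os.path.isfile(enabled_file):
            with open(enabled_file) as f:
                patches[name] = f.read().strip() == "1"
    return patches
```

An operations team might run something like this before and after applying a patch to confirm that the fix is live without a reboot.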

Read more

Google Invites Us To "Daydream" In Virtual Reality

JP Gownder

At Google I/O today, the company announced a new mobile-centric VR offering called Daydream. The nicely named Daydream builds on a mobile device platform, much as the Samsung Gear VR add-on does for Samsung S7 smartphones. Daydream combines three elements:

  • Android N smartphones optimized for VR. As Samsung has shown with its successful Gear VR efforts, it takes a high-end smartphone with high pixel density and great graphics performance to effectively drive VR experiences. Google announced that a variety of handset vendors -- including Samsung, LG, HTC, Huawei, and others -- will release smartphones that meet the new Daydream standard. Additionally, Android N will include a number of VR-specific performance optimization features.
  • Reference design for headset and controller. Google also announced a reference design for both a headset and a controller (see photo). Importantly, the controller is aware of where it is in 3D space, allowing users to interact more richly than they could with, say, an untracked controller.
  • Applications and Google Play distribution. Google will move some of its own offerings to Daydream. It has rebuilt YouTube to be more VR-aware, allowing a variety of new video content to be streamed through Daydream. Google Street View will come to VR, offering people a more powerful way to explore real-world environments.
Read more

Velocity Mandates DevOps And Continuous Deployment

Robert Stroud

Today’s customers, products, business operations, and competitors are fundamentally digital. Succeeding in this new era mandates that everyone constantly reinvent their businesses as fundamentally digital. You have two choices:

  • become a digital predator; or
  • become digital prey.

To compete in this new digital market norm, software applications and products must deliver new sources of customer value while also enabling new operational agility. I&O pros need to move from the previous practice of releasing large software products and services at sporadic intervals to continuous deployment. All must adopt key automation technologies to make continuous deployment a reality.

At Forrester, my colleagues and I (including the great Amy DeMartine) developed our recent TechRadar™: Continuous Deployment, Q2 2016, which looks at the top use cases, business value, and outlook of the 12 top technologies engaged in continuous deployment.

Our key findings include:

Continuous deployment is critical to unlock velocity

In this new era of digital business, I&O pros must automate across the entire software delivery life cycle, creating the ability to continuously deploy while assuring service quality.

No Silver Bullet

Read more

Don't Wait; Rethink Your ITIL Journey Now!

Robert Stroud

Over the past 25 years, many organizations have modelled their support – and in some cases their delivery organization – after the ITIL frameworks and processes. For many, ITIL has been helpful in establishing the rigor and governance that they needed to bring their infrastructure under control in an era where quality and consistency of service was critical and technology was sometimes fragile.

Today, we are 5 years into “The age of the customer” – an era where customer obsession is driving technology and which demands a culture of speed and collaboration to differentiate and deliver extraordinary customer experience to drive business growth. In this era, the rise of mobility and the race to deliver differentiated business processes is critical to success. Your development teams are driving velocity and elasticity with increased quality and availability, leveraging DevOps practices and often driving change directly to production.

This transition has led some organizations to experience friction between the competing priorities of velocity and control, especially those that continue to execute on the traditional model of ITIL.

ITIL is starting to show signs of age. That does not mean it is on the verge of demise. ITIL must adapt. To understand the relevance of ITIL and IT Service Management practices in this era of Modern Service Delivery, Eveline Oehrlich, Elinor Klavens, and I have embarked on a review of ITIL and the use of IT Service Management practices supporting today's BT agenda.

Our key findings include:

ITIL must pivot to support digital transformation

Read more

Big Iron Lives — Huawei Shows Off KunLun 32S x86 Server

Richard Fichera

I was recently at an event that Huawei hosted in Latin America for its telecom carrier community, at which Huawei was showing off an impressive range of carrier-related technology, including distributed data center management, advanced analytics, and a heavy emphasis on compute and storage in addition to its traditionally strong core carrier technology. Interestingly, they chose this venue for the Latin America unveiling of the KunLun server, an impressive bit of engineering which clearly shows that innovation in big-iron x86 servers is not dead. There is some confusion about whether the March announcement at CeBIT constituted the official unveiling of the actual machine, but they had a real system on the floor at this event and claimed it was the first public showing of the actual system.

The KunLun server, named after a mountain range in Qinghai Province, places Huawei squarely up against the highest-end servers from HPE, IBM, Oracle, NEC, and Fujitsu, with a list of very advanced RAS features, including memory migration, hot memory and CPU swap, predictive failure diagnostics, and a host of others, some enabled by the underlying Xeon E7 technology and others added by Huawei through its custom node controller architecture (essentially a standard feature of all large x86 servers). Partitionable into smaller logical servers, the KunLun can serve as a core transaction processor for extreme workloads or as a collection of tightly coupled but electrically and logically isolated servers.

So why unveil this high-end engine at a telecom carrier show? My read is that since the carriers will be at the center of much of the IoT action, and the data streams they process will need an ever-expanding inventory of processing capacity, this is a pretty good venue. Plus, it reinforces the emerging primacy of analytics, especially in-memory analytics, which the KunLun can address extremely well with its current 24 TB (32 GB DIMMs) of DRAM.

Read more

Microsoft Fights Back In Education

JP Gownder

Today, Microsoft's Terry Myerson announced the new strategy for Windows in the classroom. Windows 10 -- which is now Windows-as-a-service, with periodic updates delivered from the cloud -- will see a big feature update this summer with the Windows Anniversary Update, announced a few weeks ago at the BUILD developer conference. Now we're learning about the education-specific features that will take on Chromebooks.

It's no secret that Google's Chromebooks have taken the education market by storm; they now constitute more than half of shipments of new devices sold to U.S. schools. Some schools are even re-imaging old Windows PCs into Chromebooks. As a result, both Apple and Microsoft have seen their positions in the educational market slide south over the past four years.

Why does this matter? Well, for the obvious device sales implications, of course. But it's part of a longer-term customer relationship issue, too: If young people grow up not knowing Windows, will they ever care about the platform? Tomorrow's Windows customers could be shaped in today's classroom... or tomorrow's Chromebook customers could be.

For schools, Windows Anniversary Update will address key issues in education:

Read more

Oculus’ Botched Launch Harms The VR Ecosystem

JP Gownder

April 12, 2016: The day Oculus updated its Rift shipment timeframe for customers. As has been widely reported, Oculus customers face widespread months-long delays in the deliveries of their virtual reality headset purchases. To add a personal anecdote, I ordered within the first 5 minutes of the pre-launch window (once the web site started working, which it didn’t at first), and my Rift shipment has been delayed from March 30th to “between May 9 and 19th,” assuming Oculus actually succeeds in meeting its new dates.

While my personal Rift delay is merely an annoyance, the botched launch has real repercussions for the VR ecosystem. Oculus’ delay:

  • Hurts developers of games and apps. The diversity and depth of the VR developer ecosystem is impressive. While many developers focus on games – logically enough, since that’s a key early adopter demographic – others offer applications ranging from clinical treatments for PTSD to collaboration in virtual spaces. The common denominator? None of these developers are making money if there are no headsets available. And while many apps can be ported to other platforms, Oculus has been the centerpiece of many developers’ high-end VR efforts.
  • Hurts media startups and innovations. Media, too, sees a potential loss. While some media companies go the route of the New York Times and focus on Google Cardboard phone-based VR, others are counting on developing truly immersive experiences that simulate presence. Studio Jaunt VR has an Oculus app that, again, won’t be addressable until customers receive their Rifts.
Read more

Why High Performance People Need High Performance Technology

David Johnson

My colleagues and I have spent the last four years studying the links between technology, human performance at work, customer experience, and the financial performance of companies. One fascinating insight we’ve learned is that what separates the highest performing people in their work from others is their ability to reliably focus their attention, bringing more of their cognitive resources to bear on their work each day than their colleagues do. It’s not easy in our distraction-rich, techno-charged world.

There’s plenty of research proving that happy employees are more productive, but Drs. Teresa Amabile and Steven J. Kramer made an important discovery in 2010 that turns conventional wisdom about where happiness at work comes from upside down. The most powerful source of happiness at work isn’t money, free food, or recognition, but rather getting things done: making progress every day toward work that we know is important. The more conducive our work environment is to staying focused, and the better we are at suppressing the sources of distraction within ourselves to get our most important work done, the happier we will be at work. And the effect is even stronger for work that requires creativity and problem-solving skills.

Unfortunately, in our workforce technology research, technology distraction isn't on the list of things leaders are concerned about. It should be, because the most pernicious sources of distraction employees face are the ones that lie beyond their control: the distractions that originate from the technologies their employers require them to use, when there are no alternatives.

Read more

Reality Check: Virtual Reality Isn’t A Real Market. Yet.

JP Gownder

You’re probably hearing a lot of endless, excessive and short-term virtual reality (VR) hype. For example, at SXSW 2016, a great deal of time and energy is being devoted to VR experiments, new media announcements, and demonstrations. 

The reality? The vast majority of consumers aren’t there yet, don’t know or care about VR, and won’t know or care in 2016 unless they are hardcore gamers. And only a few forward-looking enterprises – digital predators – are experimenting with VR in effective ways today.

At Forrester, we believe that VR will find its place in the pantheon of important computing platforms, eventually reshaping the way workers work, enterprises interact with customers, and consumers perform a variety of tasks. In other words, it's going to be a real market... at some point.

Too many clients think that VR is a platform that they simply must address in 2016. We think that’s premature. Even in the era of hyperadoption, VR must overcome key obstacles to gain mass market status:

  • Need for market education. Most consumers don’t have a deep understanding of VR, nor is there an easy venue for them to learn about it. Retailing represents a challenge: Buyers must experiment with a headset for many minutes to even get a sense of what the technology does. In past technology revolutions (smartphones, tablets), the Apple Store played this role… but Apple isn’t in the VR game (yet).
Read more

Software-Defined Data Center, Coming Ready Or Not!

Robert Stroud

As we embark on the era of “cloud first” being business as usual for operations, one of the acronyms flying around the industry is SDDC, or the Software-Defined Data Center. The term, very familiar to me since starting with Forrester less than six months ago, has become an increasingly frequent topic of conversation with Forrester clients and vendors alike. It is germane to my first Forrester report, “Infrastructure as Code, The Missing Element In The I&O Agenda”, where I discuss the changing role of I&O pros from building and managing physical hardware to abstracting configurations as code. The natural extension of this is the SDDC.
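The "configurations as code" idea can be reduced to a simple loop: the desired state of the environment is declared as data kept in version control, and a reconciliation step computes the actions needed to converge the live environment toward it. A toy sketch of that loop (purely illustrative; the function and resource names are my own, not any product's API):

```python
def reconcile(desired, current):
    """Compare a declared (desired) configuration against the live
    (current) state and return the actions needed to converge.

    Both arguments map resource name -> configuration dict.
    """
    actions = []
    for name, cfg in desired.items():
        if name not in current:
            actions.append(("create", name, cfg))
        elif current[name] != cfg:
            actions.append(("update", name, cfg))
    for name in current:
        if name not in desired:
            actions.append(("delete", name))
    return actions

# The declared state lives in source control; applying it repeatedly
# is idempotent, which is what lets I&O pros treat infrastructure
# changes like code changes.
desired = {"web": {"vcpus": 4, "image": "nginx"}}
current = {"web": {"vcpus": 2, "image": "nginx"}, "old-db": {"vcpus": 8}}
plan = reconcile(desired, current)
```

Real tools (configuration management, orchestration, and cloud templating systems) implement far richer versions of this same compare-and-converge pattern.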


We believe that the SDDC is an evolving architectural and operational philosophy rather than simply a product that you purchase. It is rooted in a series of fundamental architectural constructs: modular, standards-based infrastructure; virtualization at all layers; and complete orchestration and automation.


The Forrester definition of the SDDC is:


An SDDC is an integrated abstraction model that defines a complete data center by means of a layer of software that presents the resources of the data center as pools of virtual and physical resources, and allows them to be composed into arbitrary user-defined services.


Read more