The move to the cloud is rapidly changing how companies deploy and consume security services. The number one driver of adoption for managed security services (MSS) and the business of managed security service providers (MSSPs) is complexity reduction. As companies replace premises-based data centers with virtual cloud data centers, their expectations will change as well: they will look for elastic ways to purchase security services and for methods that enable the active defense of both cloud and premises-based workloads. Consider the following:
We have heard that the perimeter is dead, and in many ways it is. The usual assassins include outsourcing, mobile solutions, and the cloud.
Another truism is that companies never wanted to be in the information technology business in the first place. Information technology has brought real productivity improvements, but it has also brought significant costs.
Moving information technology to the cloud provides companies the opportunity to reallocate costs from capital expenditures to operational expenditures and reassign operations staff to other roles.
Formula One has gotten us all used to amazing speed. In as little as three seconds, F1 pit teams replace all four wheels on a car and even load in dozens of liters of fuel. Pit stops are no longer an impediment to success in F1 — but they can be differentiating to the point where teams that are good at it win and those that aren’t lose.
It turns out that pit stops not only affect speed; they also maintain and improve quality. In fact, prestigious teams like Ferrari, Mercedes-Benz, and Red Bull use pit stops to (usually!) prevent bad things from happening to their cars. In other words, pit stops are now a strategic component of any F1 racing strategy; they enhance speed with quality. But F1 teams also continuously test the condition of their cars and external conditions that might influence the race.
My question: Why can’t we do the same with software delivery? Can fast testing pit stops help? Today, in the age of the customer, delivery teams face a challenge like none before: a business need for unprecedented speed with quality — quality@speed. Release cycle times are plummeting from years to months, weeks, or even seconds — as companies like Amazon, Netflix, and Google prove.
Well, it’s now been about nine months, and time to check in on the gestation of the DATA Act. But before we start on what’s happened since the law passed on May 9, 2014, let’s take a quick look at what it is, and what government organizations have to work with.
This bipartisan legislation – jointly sponsored by two Democrats and two Republicans – is an effort to modernize the way the government collects and publishes spending information, in particular by establishing standard elements and formats for the data. The new law assigns responsibility for the task, sets out a four-year timetable for implementation, and establishes a strict oversight regime to measure compliance in the adoption of the standards and the subsequent quality and timeliness of the published spending data. That oversight is the big difference between the DATA Act and previous legislation to improve funding transparency. This time someone is watching, and the law has teeth.
Telefónica entered into an exclusivity agreement with Hutchison Whampoa regarding Hutchison’s potential acquisition of the Telefónica subsidiary O2 UK for £10.25 billion in cash, valuing the deal at an estimated 7.5 times 2014 EV/EBITDA. The Hutchison-O2 UK deal — should it complete — will entirely redraw the telco landscape in the UK in terms of market shares. The acquisition of O2 UK will transform Hutchison from the smallest mobile operator with 7.5 million customers to the largest with 31.5 million customers and reduce the number of mobile operators in the UK from four to three.
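The quoted multiple implies a rough estimate of O2 UK's 2014 earnings. A minimal sketch of the arithmetic, assuming the £10.25 billion cash consideration approximates enterprise value (the actual EV could differ if net debt or deferred payments are included):

```python
# Hedged illustration: back out the implied 2014 EBITDA of O2 UK from
# the headline price and the quoted EV/EBITDA multiple. Assumes the
# £10.25bn cash consideration approximates enterprise value (EV).

def implied_ebitda(enterprise_value_bn: float, ev_ebitda_multiple: float) -> float:
    """Return the implied EBITDA (in £bn) given EV and an EV/EBITDA multiple."""
    return enterprise_value_bn / ev_ebitda_multiple

ebitda = implied_ebitda(10.25, 7.5)
print(f"Implied 2014 EBITDA: ~£{ebitda:.2f}bn")  # roughly £1.37bn
```

In other words, a 7.5x multiple on a £10.25 billion price implies annual EBITDA of roughly £1.37 billion for the acquired business.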
This development follows on the heels of the announcement by Orange and Deutsche Telekom that they have entered into exclusive negotiations with BT Group regarding a potential divestment of 100% of their shares in EE, their joint venture in the UK. The increased merger activity is not surprising, and we predicted as much in our report Predictions 2015: Telecoms Will Struggle To Align To The CIO's BT Agenda. Still, these deals raise important questions for the European telecoms markets:
Customers are using more communication channels for customer service than ever before. They are also contacting customer service organizations more frequently. Companies are rising to this challenge: overall satisfaction with the quality of service across all communication channels is trending upward.
Moreover, customers have little appetite for long or difficult service interactions, such as navigating arduous interactive voice response (IVR) menus or waiting in queues to reach a phone agent. They are increasingly turning to self-service as the easiest path to resolution. Here are some key takeaways from our latest consumer survey about channel usage for customer service.
For the first time in the history of our survey, respondents reported using the FAQ pages on a company's website more often than speaking with an agent over the phone. Use of the help/FAQ pages on a company's website for customer service increased from 67% in 2012 to 76% in 2014, while phone interactions have remained constant at a 73% usage rate.
Other self-service channels have also seen increased usage since 2012. For example, use of communities and virtual agents jumped by more than 10 percentage points each. We also see robust uptake of speech and mobile self-service channels.
Self-service adoption increased across all generations from 2012 to 2014, with the largest increases attributable to older boomers (ages 59-69) and the golden generation (ages 70+).
Online chat adoption continues to rise – from 38% in 2009 to 43% in 2012 to 58% in 2014. Screen sharing, cobrowsing, and SMS are other channels growing in popularity among young and old alike.
After a brief hiatus for the holidays, the S&R podcast is back! For those who are new to the podcast, each month we use our First Look newsletter and podcast to highlight one of the terrific analysts on Forrester's Security and Risk team. The podcast and newsletter are great ways for Forrester readers to get to know a little more about the analysts writing the reports. This month we spotlight 4-year Forrester vet Ed Ferrara, one of our vice presidents and principal analysts focused on security strategy, budgets, metrics, consultancies, and managed services — all the topics that you want to tackle at the beginning of a new year.
Click below to listen to the podcast! If you're not signed up for our newsletters, I highly encourage you to do so; please email firstname.lastname@example.org for additional details.
To download the mp3 version of the podcast, click here.
One of Microsoft’s announcements today is the overhaul of its digital whiteboard formerly called PPI — now rebranded as Surface Hub. Surface Hub is an 84" 4K resolution (or 55" without 4K) all-in-one touchscreen computer with collaboration features for conference rooms. The market for this device is primarily industries that require large-screen visualization, of which there are many: manufacturing, healthcare, higher education, publishing, architecture, engineering, and oil & gas are prime examples.
However, digital whiteboards are increasingly attractive to all organizations. We see a bifurcation of conference room equipment for visual communications: On the low end, more companies are putting just USB webcams in ad hoc collaboration spaces. On the high end, we're getting inquiries from customers taking another look at specialized hardware but uninterested in telepresence for cost or functionality reasons. For customers creating these specialized collaboration rooms, whiteboarding and application sharing are just as important as video.
Three initial impressions from Microsoft’s announcement:
It’s not often that a new product release has the potential to reshape the way people work and play. The PC, the browser, the smartphone – all of these products fell into that category.
Microsoft’s new HoloLens has the potential to do the same. (Check out some photos from Gizmodo here -- they don't live up to the actual experience even a little bit -- and this video, which doesn't do it justice, either).
Yes, that’s a big claim. But I’m here to challenge your thinking with this assertion: Over the next few years, HoloLens will set the bar for a new type of computing experience that suffuses our jobs, our shopping experiences, our methods for learning, and how we experience media, among other life vectors. And other vendors will have to respond to this innovation in holographic, mixed reality computing.
Microsoft’s event, Windows 10: The Next Chapter, showcased an impressive vision and plan for: 1) transforming Windows, including free upgrades; 2) gaining relevance in mobile; 3) launching a new computing experience with HoloLens; and 4) reinventing group collaboration with Surface Hub.
Based on what we saw today and on background conversations, Forrester believes that Microsoft’s Windows 10 will persuade enterprises and consumers to upgrade from Windows 7 (something Windows 8 didn’t do) and be an easy upgrade from Windows 8. Getting the world’s 1.5 billion Windows PCs on this new software platform will re-establish Microsoft’s dominance in personal computing with a mobile extension.
The technology advances in Windows 10 include a single, integrated experience across PCs and tablets with one platform and app store; continuous software improvements; big security improvements; and a new set of enterprise features. (Forrester clients can get our enterprise perspective in our new report, "Microsoft Gets Its Flagship OS Back On Track With Windows 10.") Windows 10 even brings together Windows and Xbox -- giving gamers and game developers access to Xbox experiences on the PC.
HoloLens Is A Powerful New Technology To Deliver Mixed Reality Experiences
On one level, IBM’s new z13, announced last Wednesday in New York, is exactly what the mainframe world has been expecting for the last two and a half years – more capacity (a big boost this time around: triple the main memory, more and faster cores, more I/O ports, etc.), a modest boost in price performance, and a very sexy cabinet design. (I know it’s not really a major evaluation factor, but I think IBM’s industrial design for its Flex System, Power, and z System enclosures is absolutely gorgeous and belongs in MoMA.) IBM indeed delivered against these expectations, plus more. In this case, a lot more.
In addition to the required upgrades to fuel the normal mainframe upgrade cycle and its reasonably predictable revenue, IBM has made a bold but rational repositioning of the mainframe as a core platform for the workloads generated by mobile transactions, the most rapidly growing workload across all sectors of the global economy. What makes this positioning rational rather than a pipe dream for IBM is an underlying pattern common to many of these transactions – at some point they access data generated by and stored on a mainframe. By enhancing the economics of the increasingly Linux-centric processing chain that occurs before the call for the mainframe data, IBM hopes to foster the migration of these workloads to the mainframe, where their access to the resident data will be more efficient, benefiting from inherently lower latency for data access as well as from embedded high-value functions such as accelerators for inline analytics. In essence, IBM hopes to shift the center of gravity for mobile processing toward the mainframe and away from the distributed x86 Linux systems that IBM no longer manufactures.