According to our survey data dating back to 2008, despite year after year of high-profile security breaches, from Heartland Payment Systems to WikiLeaks to Sony, security budgets have increased by only single-digit percentages. That is hardly enough to keep up with the increasing sophistication of attacks, the avalanche of breach notification laws, and the changing business and IT environment.
The changing business and IT environment is perhaps the greatest concern. With a massive explosion of mobile devices and other endpoint form factors and an ever-expanding ecosystem of customers, partners, clouds, service providers, and supply chains, you have less and less direct control over your data, your applications, and your end users' identities. We refer to this expanding ecosystem as the “extended enterprise.” In an extended enterprise, a business function is rarely, if ever, a self-contained workflow within the infrastructure boundaries of the company. We believe the extended enterprise is such a major shift for CISOs and security professionals that we have dedicated our upcoming Security Forum to it, as well as a significant stream of research.
I just spent most of this week at the annual itSMF conference, Fusion, held this year at the sprawling Gaylord National Resort just south of Washington, DC. As always, it was a wonderful gathering of some of the finest people I know. When you’ve been involved in the IT service management field as long as I have, you get to know a LOT of these people very well. In fact, when I delivered the closing keynote of Fusion in 2009, I opened by saying, “This feels like a family reunion … except I like you more!” I was only half joking because many of these people ARE like family and I do indeed like them.
As Forrester’s “automation guy,” I often make statements about the flaws of the people in IT. I always try to inject some comedy into these statements because we have to be able to laugh at ourselves. There is a serious side to this position, however. There are now just under 7 billion idiots on this planet, and none of us is exempt from that characterization. People do dumb things. We all do. Hopefully, we do more smart things than dumb things. Since we do dumb things, we need to protect ourselves from ourselves.
ITSM is one of many mechanisms that offer such protection. We need ITSM because IT has rightfully earned an awful reputation for chaotic execution. IT seems to be one of the groups that most egregiously exemplifies human error and sloppiness. It is full of smart people doing dumb things. We in IT have a very serious problem.
My colleague Lutz Peichert recently wrote a blog about the need for continuous risk management as it relates to your IT supplier base. While his focus was more on monitoring software and hardware vendor risk, I want to step up and remind IT services buyers of the same thing. As I look at what’s happening in the steaming hot global IT services market and at the increased responsibility and access IT service providers are being given today (see Maintaining Vendor Management Vigilance In The Overheated Global Sourcing Market), I can’t help but worry that a single outage, bankruptcy, fraud, or bad acquisition could spell disaster for a client. SVM executives have to continuously assess their IT services vendors’ viability and ensure that they have alternate options in case of vendor failure.
Publicly traded companies are obviously much easier to monitor due to their financial transparency; however, there is still potential for fraud (as we saw with Satyam and Longtop Group) or M&A activity that may not be reported in standard sources but that may leave customers in an unfavorable position. So, for “critical” suppliers in your portfolio, due diligence should include research outside of normal channels: social media, job websites, and financial analysts (who have a pulse on M&A activity and the health of suppliers’ revenues and profit margins).
It seems that every week another vendor slaps “big data” into its marketing material – and it’s going to get worse. Should you look beyond the vendor hype and pay attention? Absolutely yes! Why? Because big data has the potential to shape your market’s next winners and losers.
At Forrester, we think clients must develop an intuitive understanding of big data by learning: 1) what is new about it; 2) what it is; and 3) how it will influence their market.
What is new about big data? We estimate that firms effectively utilize less than 5% of available data. Why so little? The rest is simply too expensive to deal with. Big data is new because it lets firms affordably dip into that other 95%. If two companies use data with the same effectiveness but one can handle 15% of available data and one is stuck at 5%, who do you think will win? The catch, however, is that big data is not like your traditional BI tools; it will require new processes and may totally redefine your approach to data governance.
I recently published an update on power and cooling in the data center (http://www.forrester.com/go?docid=60817), and as I review it online, I am struck by the combination of old and new. The old: the evolution of semiconductor technology, the increasingly elegant attempts to design systems and components that can be incrementally throttled, and the ever more sophisticated construction of the data centers themselves, with greater modularity and physical efficiency of power and cooling.
The new is the incredible momentum I see behind data center infrastructure management (DCIM) software. In a few short years, DCIM solutions have gone from simple aggregated viewing dashboards to complex software that models tens of thousands of components; collects, filters, and analyzes data from thousands of sensors in a data center (a single CRAC may have in excess of 20 sensors, a server more than a dozen); and understands the relationships between components well enough to proactively raise alarms, model potential workload placement, and make recommendations about prospective changes.
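To make the sensor-analysis idea concrete, here is a minimal sketch of the kind of threshold-based alerting a DCIM tool performs under the hood. Everything here is hypothetical for illustration: the component names, metrics, and thresholds are invented and do not correspond to any vendor's product or API.

```python
from statistics import mean

# Hypothetical sensor stream: (component_id, metric, value) tuples of the
# sort a DCIM platform collects from CRAC units and server inlet probes.
readings = [
    ("crac-01", "supply_air_c", 18.2),
    ("crac-01", "supply_air_c", 18.4),
    ("server-42", "inlet_temp_c", 30.0),
    ("server-42", "inlet_temp_c", 31.0),
]

# Illustrative alarm thresholds per metric; real limits vary by facility.
THRESHOLDS = {"supply_air_c": 20.0, "inlet_temp_c": 27.0}

def alarms(readings, thresholds):
    """Average each component's readings per metric and flag any average
    that exceeds that metric's threshold."""
    grouped = {}
    for component, metric, value in readings:
        grouped.setdefault((component, metric), []).append(value)
    return [
        (component, metric, round(mean(values), 1))
        for (component, metric), values in grouped.items()
        if mean(values) > thresholds[metric]
    ]

print(alarms(readings, THRESHOLDS))  # server-42's inlet averages 30.5 C, over the 27.0 limit
```

A real DCIM product layers much more on top of this, such as component relationship models and workload placement simulation, but filtering raw sensor data down to actionable alarms is the core loop.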
Of all the technologies reviewed in the document, DCIM offers one of the highest potentials for improving overall efficiency without sacrificing reliability or scalability of the enterprise data center. While the various DCIM suppliers are still experimenting with business models, I think that it is almost essential for any data center operations group that expects significant change, be it growth, shrinkage, migration or a major consolidation or cloud project, to invest in DCIM software. DCIM consumers can expect to see major competitive action among the current suppliers, and there is a strong potential for additional consolidation.
Yes, but you must adapt by demonstrating your ability to drive business growth and differentiation, not just cost savings and uptime. Here’s a personal example of a much broader trend as to why this is so important to your business and your role as an I&O professional:
It’s a cool autumn day, which reminds me I need a new jacket. I walk into Patagonia. I evaluate several models and then buy one – but not from Patagonia. It turns out a competitor located two miles away is offering the jacket at a discount. How did I know this? I scanned the product's bar code using the RedLaser app on my iPhone, which displayed several local retailers with lower prices. If I had been willing to wait three days for shipping, I could have purchased that same jacket, while standing in Patagonia, from an online retailer with an even better deal. [Truth be told: I actually bought the jacket from Patagonia's store after validating no better deals existed… but The Home Depot wasn’t so lucky this summer when I bought the same air conditioner, for less, from Amazon while standing in aisle 4.]
This is a prime example of what Forrester calls “The Age Of The Customer,” in which empowered buyers have the information at their fingertips to check a price, read a product review, or ask a friend for advice right from the screen of a smartphone. This type of technology-led disruption is eroding traditional competitive barriers across all industries; manufacturing strength, distribution power, and information mastery can't save you.
Development leaders! Project leaders and business analysts! Application and solution architects! Want to move forward on your business technology (BT) journey and be viewed by your business stakeholders as a valuable team member? Take a tip from last week's Forums held in Boston: Embrace business process management (BPM) and customer experience. Don't ignore them; embrace them. Why? They're essential to helping you achieve your business outcomes.
I know, I know. You read the above and now think "Gee Kyle, what's next? Going to enlighten me on some new BPM or customer experience management technology that's going to transform my very existence, my company's future?"
Nope. Let me explain....
Last week we hosted more than 250 of your application development and delivery and business process peers in Boston and focused on how to succeed in the new world of customer engagement. The most impactful discussions I heard were the side conversations we held with attendees, sometimes occurring over dinner and cocktails. We didn't discuss technology. We discussed the skills your peers were developing in two fundamental areas:
BPM - no, not the technology but the Lean and Six Sigma based methods, techniques, and tools organizations use to focus on business processes and not functions; to strive for continuous improvement; and to focus on customer value.
Customer experience - defined more eloquently by my peer Harley Manning, but I'll summarize as the methods, techniques, and tools used to understand how customers perceive their interactions with your company.
Amazon’s product strategists shocked some constituencies with their $199 price point for the Amazon Kindle Fire tablet announced today. But there’s a fundamental product strategy lesson in this pricing, and it follows an old model: the so-called razor-razorblade pricing model.
We all know this model well as consumers: Your initial purchase of a razor is relatively cheap, but the cost of replacement blades really adds up over time. If you don’t buy razors, perhaps you’re familiar with this scenario from your inkjet printer. Remember how cheap that scanner/printer was? But have you ever seen the price of replacement ink cartridges?
The razor-razorblade model works when “dependent goods” – the refills, the stuff you need to keep buying to use the product – are closely related to the anchor product. In the case of the Kindle Fire, the dependent goods are content and services: MP3s, streaming videos, and of course books, magazines, and newspapers, plus cloud services that store and synchronize your content across devices. Amazon’s product strategists can afford to charge a low entry price to drive adoption of the device and then (they hope) deliver an experience attractive enough that Kindle Fire owners will pay for it as a service.
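The arithmetic behind this model is worth making explicit. The sketch below uses entirely hypothetical numbers (the build cost, content margin, and ownership period are assumptions for illustration; only the $199 price comes from the announcement) to show how dependent goods can turn a hardware loss into a profit over the life of the device:

```python
# Hypothetical razor-razorblade arithmetic. All figures except the $199
# list price are illustrative assumptions, not Amazon's actual economics.
device_price = 199.00  # Kindle Fire entry price
device_cost = 210.00   # assumed build cost, above the sale price
hardware_margin = device_price - device_cost  # a loss on the "razor"

content_margin_per_month = 4.00  # assumed margin on media, apps, and cloud services
months_owned = 24                # assumed ownership period

# Lifetime contribution: the hardware loss plus cumulative "razorblade" margin.
lifetime_value = hardware_margin + content_margin_per_month * months_owned
print(f"hardware: ${hardware_margin:.2f}, lifetime: ${lifetime_value:.2f}")
```

Under these assumed numbers, an $11 loss per device becomes an $85 lifetime contribution, which is why the entry price can sit below cost as long as owners keep buying content.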
Hence Amazon CEO Jeff Bezos’ portrayal of the Kindle Fire product strategy: “What we are doing is offering premium products at non-premium prices,” Bezos says. Other tablet contenders “have not been competitive on price” and “have just sold a piece of hardware. We don’t think of the Kindle Fire as a tablet. We think of it as a service.”
Much of the discussion around integrating applications with the Internet has centered on mobile applications connected to web backends that deliver better customer experiences than mobile apps or websites could by themselves. But the real power of this concept comes when a full ecosystem can be delivered that leverages the true power and appropriateness of mobile, desktop, and cloud-based compute power. And if you want to see this in action, just look to Autodesk. The company, which we highlighted in this blog last year for its early experimentation with cloud-based rendering, has moved that work substantially forward and aims to change the way architects, engineers, and designers get their jobs done and dramatically improve how they interact with clients.