As we watch the Hurricane Sandy news unfold, our thoughts are with the individuals and companies impacted by the storm. It's clear that a disaster of this magnitude will have lasting effects on how IT organizations plan for and react to natural disasters. Best case, I hope business technology leaders take note and learn from the experience, making planning for such events a priority for their organizations. Many of Forrester's largest clients already have comprehensive business resiliency plans in place for events like this, but many others do not. In terms of reacting to Sandy, my colleague Stephanie Balaouras just posted a great checklist of best practice activities for IT leadership teams to consider in the coming days as they work to restore IT operations for their organizations. Longer term, as Forrester's Rachel Dines will tell you, planning for events like this requires a lot more effort.
Forrester's CIO Practice wishes those impacted a safe and speedy recovery.
Every once in a while I get the opportunity to completely immerse myself in one of my client's problems. This time, it was at an IT strategy offsite where a senior director of IT asked me one simple question: "How can we use information technology to help our company open up new streams of revenue?" I found the question refreshing, mainly because nine out of ten CIOs I talk to these days ask the opposite: "How can IT reduce costs?"
Fortunately for me, this client had invited lots of smart people from business and IT, along with outside experts, into the room for two days to explore the revenue question. Challenges familiar to most life sciences companies got folks in the room in the first place: patent expiration foretold eroding profit margins for their blockbuster drugs, and expansion into emerging markets was an avenue for growth, yet one fraught with complexity and uncertain returns.
So this group's charter was to think digital.
Taking inspiration from morning TED talks, four working teams presented ideas for revenue opportunities created by emerging technologies in tech markets like mobile, big data, security, and consumer experiences. The teams cited creative ideas — like algorithms that use public data sources to reduce production forecast variance and improve distribution allocations; modern IT security models, pioneered in financial services, that would enable them to cut the time it takes to on-ramp a new manufacturing partner from months to weeks; company-owned data that could be sold or licensed and made available through public APIs to third parties; and many more.
The teams shared inspiring ideas that were big enough and bold enough to at least warrant further investigation.
Occasionally, I take a look back at my research and see what we got right, and what we got wrong. In 2008, we wrote "Embrace The Risks And Rewards Of Technology Populism" to describe what we saw as the inevitable acceleration of technology change at the edges of organizations. Think mobile, social, and cloud technologies, and the influence they were having on corporate IT. We of course got the name wrong: Technology Populism never really stuck. But four years later, the phenomenon itself is all around us, wrapped up in the more accepted term "Consumerization of IT" (and corporations' defensive response, "Bring-your-own IT"). Every day, we're reminded of the incredible growth of social networks that now influence all aspects of society, from traditional media and corporate brands to the direction of politics and governments. Every day, we see people around us glued to personal mobile devices -- texting with friends and colleagues, reading news, and checking in on Facebook. And sadly, every day, people walk into their employers and sense that the technology they use at work looks older and runs slower than what they have at home. In fact, it's become hard to remember a world when this wasn't the case.
But besides the name, we also got other things wrong:
My latest Forrester CIO client visits tell me economic uncertainty is actually helping IT leaders accelerate plans for the future. Sounds counterintuitive, right? Perhaps it’s just because I started in Europe, where the hourly ups and downs of sovereign debt crises cause government policy whiplash paired with market cap acid reflux. Most consider trying to plan anything in this environment, particularly slow-changing corporate IT systems, impossible. Or perhaps, as IT leaders, we’re still just haunted by the great tech recession a decade ago, and we just expect IT budgets will always be the target of corporate austerity efforts.
But one thing is clear: For some, uncertainty breeds paralysis. For others, the very presence of uncertainty offers a platform to drive clever and radical change. Consider two of the many stories about the latter I heard recently:
One IT leader uses uncertainty to reduce his dependency on Microsoft software clients. To be clear, every IT leader I met faces daunting budget pressure. This client’s business is producing basic materials for construction projects globally. Depressed demand for building materials means his company has turned otherwise dormant kilns for firing these materials into ovens for destroying old tires and drugs seized by police. Why? Because finding productive uses for capital investments helps (at least) service debt on that capital when current market demand disappears (and apparently, these kilns burn at such a high heat that they produce zero emissions — that’s cool).
Last week the New York Times Bits blog published an article, "IT Departments Lose Their Clout Over Phone Choices," citing some recent Forrester data. The article got all the data facts right about people provisioning their own technology, but I was kinda surprised by some of the vitriolic comments people posted about the article. The first commenter claimed: "This is the beginning of the end of the corporate IT department."
Really? I'm not sure why people jump to this conclusion when they hear the word "IT consumerization," but I've seen it more than once before. I think it's really naive. People typically support this "death of IT" viewpoint with assertions like: 1.) technology is getting easier to use; 2.) people are getting smarter about how to use tech; and 3.) IT people just get in the way. The first and second points are accurate: technology *is* getting easier to use and provision, and generally speaking, our data indicates every generation is getting more tech savvy. That's a good thing we should all celebrate because it likely means lower costs for low-value stuff that IT people must do today (e.g. resetting passwords, installing software, distributing software patches, fixing machines, etc.). But to assume this is the only role of IT people shows profound ignorance. And considering consumerization (read: the influence of Google, Apple, etc.) has been with us for at least five years and the corporate IT job market has fared better than most occupations, there seems to be no basis in fact for the claim that IT grows less necessary as consumerization rises.
As Matt Brown wrote earlier this month, video is quickly becoming a core technology component of the workplace experience. Henry Dewing predicted the resurgence of investment in video conferencing from the conference room to the desktop in 2007 based on affordable HD-quality video, more user-friendly interfaces, and better interoperability between systems. We're not yet at the tipping point of widespread adoption, but we're moving there rapidly. And when we do, the primary source — and destination — for video content will be the devices broadly provisioned to employees: the desktops, laptops, tablets, and smartphones from which employees will create, publish, and interact with video. Consider some of the trends we've seen accelerate over the past six months:
Mobile, social, video, and cloud collaboration services are quickly becoming four technology legs of a stool supporting what I call the workplace experience. Enterprise investment in these technologies continues to outpace the overall IT market.
Yet taken alone, I'd argue these technologies offer little to no competitive advantage to firms.
Why? One reason: Thanks to the likes of Apple, Google, Microsoft, IBM, Cisco, and a broad array of technology suppliers, virtually every company in the world can now access them. Consider the facts:
Cloud collaboration services: Evidence suggests small companies can put these technologies to use faster than their larger counterparts. Basic business collaboration services can now cost less than a daily cup of coffee to run for employees when provisioned via the cloud. What it means: Barriers to use are low.
Enterprise mobile technologies: Individual employees are able to put the latest mobile devices and apps to productive business use faster than their employers can. Our data suggests the most highly mobile (and highly paid) employee segments (33% of the information workforce) already embrace these tools to make themselves more productive from work, from home, and from the road. What it means: Companies have little control over who uses these.
Enterprise social tools: The current adoption barriers social technologies face in enterprises (by the numbers, it truly is dismal) appear to have more to do with cultural ambivalence and organizational complexity than they do with technology complexity. What it means: Many IT shops have overcome the tech complexity and are now scratching their heads on these other factors.
Competitors to Microsoft Office receive plenty of attention in the blogosphere these days. Whether it’s Google announcing a new mobile or social feature in Docs, Zoho unveiling a new API partner, or the recent buzz around the future of Open Office without Oracle, it’s natural to wonder how much traction these applications are getting with corporate IT.
Open Office has a global presence, although predominantly in government and education. Google Apps for Business has a growing list of customers, although many are using Gmail, not Docs. Overall, alternatives’ take of the office productivity pie — particularly in large enterprises — is still very small.
Yet we hear from many organizations considering or piloting them. In fact, over a third of respondents to our March survey of IT decision-makers with influence over the productivity tool kit claim to be “actively looking at” or “piloting” alternatives. So why does adoption remain so paltry?
I’ve spent most of my career working with IT people making IT decisions on behalf of people who use technology for work. What I love about IT people is their utter devotion to the idea that technology can profoundly change how people work. To improve their productivity, to remove barriers to collaboration, to spark groundswell innovation, and more. Just the other day, I spoke to one named Fred who said to me: “We’re introducing new technologies to change the culture of our organization.”
What a courageous and inspirational idea coming from an IT leader. We’ll just assume he meant to add “…for the better.”
I hear stuff like this all the time, particularly when Content & Collaboration Professionals are planning major initiatives for social technologies, mobile technologies, and collaboration tools inside companies with market caps that dwarf the GDP of entire countries. Big ones. And of course I hear it in the tech trade mags I read, at conferences, and from human capital people fretting about baby boomers turning into octogenarians, and the nano-toting, Angry Birds-playing, malcontent Millennials sporting ADD-like technology tendencies at work. I’ll work for a Millennial one day. I’m not critiquing. Just observing.
So back to Fred. Assuming like me you work around folks from IT, I’ll ask: will Fred succeed?
Personally, I don’t gamble in casinos, but I do at work. I place bets on people I hire, budget dollars I spend, arguments I think I’ll win (but often lose), and even on the research ideas analysts on my team come to me with. After all, with luck, the right bets will put my three kids through school someday.
Ever heard a senior leader in your organization proclaim “Everyone’s in sales!”? I have. In fact, it’s a phrase I’ve heard a lot in the last three years from executives at conferences, industry events, client meetings, and more. To me, it’s right up there with “All hands on deck!” and “Ask not what your country can do for you, ask what you can do for your country” (only in a less evocative, corporate-speak kind of way).
Yet during the most recent recession, the phrase has taken on a more urgent tone and seems to mean: “Demand stinks. Drop what you’re doing. Advocate for the company.” Particularly among already hyper-efficient companies, stimulating demand with a whole-company response may just be one recipe for retaining existing jobs and creating new ones.
But are employees responding? Forrester’s most recent Workforce Forrsights survey suggests few actually heed this executive call to action.
As an extension of our Empowered research series looking at employee empowerment, we decided to measure employee advocacy by borrowing the methodology of Net Promoter. We surveyed over 5,000 information workers across five countries: the United States, Canada, the United Kingdom, France, and Germany. We included 18 different professions and multiple industries. In the report "Do Your Employees Advocate For Your Company?", we use two questions to measure employee advocacy:
How likely are you to recommend your company’s products or services to a friend or family member?
How likely are you to recommend a job at your company to a friend or family member?
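For readers unfamiliar with the mechanics, Net Promoter-style scoring typically asks each question on a 0-10 scale, counts 9-10 responses as promoters and 0-6 responses as detractors, and reports the percentage of promoters minus the percentage of detractors. As an illustrative sketch only (not Forrester's actual survey instrument or weighting), the arithmetic looks like this:

```python
def net_promoter_score(ratings):
    """Compute a Net Promoter-style score from 0-10 ratings.

    Promoters rate 9-10, detractors rate 0-6; passives (7-8)
    count toward the total but neither bucket. Returns the
    percentage of promoters minus the percentage of detractors.
    """
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Hypothetical responses to "How likely are you to recommend
# a job at your company to a friend or family member?"
sample = [10, 9, 8, 7, 6, 3, 9, 10, 5, 8]
print(net_promoter_score(sample))  # 4 promoters - 3 detractors = 10.0
```

Running the same calculation separately on each of the two questions yields a product-advocacy score and a job-advocacy score that can then be compared across professions and countries.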