Over the last couple of months I’ve seen a real uptick in inquiries from services procurement and vendor management teams asking about co-employment risk. This is partly due to concern that the U.S. government, in search of new sources of tax revenue, will double down on its efforts to identify large companies that are potentially misclassifying workers.
The concerns have been exacerbated by software and consulting vendors overhyping co-employment risks as an avenue to sell vendor management systems and managed services. That’s not to say that software and services can’t help, but I do get the sense that the issue at hand may not merit the attention it gets in vendor teleconferences and white papers.
Business-IT alignment is one of those persistent "Top 3" CIO issues. It has been this way just about as long as I’ve been in IT. You would think this would have been solved by now. After all, you put in business-driven IT governance, relationship managers, and some really nice dashboards, and you’ve covered about 90% of the advice out there. I’m going to suggest that business-IT alignment is being held hostage by complexity. Not technology complexity, since business leaders seem to be coming to terms with that. And not the mind-numbing spaghetti charts that show how complex our application and infrastructure landscapes are. Business leaders don’t understand these charts, but since we don’t understand them either, we can hardly expect them to. The complexity I’m referring to lies between their goals and the "stuff" IT delivers. They don’t see the connection. And since business execs have lots of goals, which shift over time, and strategies that also shift, we can’t show the connection. Instead, we say, "This is what you asked for, and this is what we delivered."
This week is Interop Las Vegas 2010, arguably the largest industry conference in North America targeting IT professionals. While the event has its roots in networking, today’s Interop has 13 tracks ranging from cloud computing and virtualization, to mobility and video conferencing, to governance, risk, and compliance. I’ve had the pleasure of chairing the data center and green IT tracks at the last three Interop Las Vegas and New York events.
Don’t have the opportunity to be at Interop in person? Forrester has you covered…
Fellow Forrester analyst, Rachel Dines, and I are onsite at Interop and we will be posting the key takeaways for IT Infrastructure & Operations (I&O) professionals here on Forrester’s I&O blog. We encourage you to check the blog over the next few days for Forrester’s insights on the following data center and green IT sessions:
I'll admit to spending only three hours on the show floor. Most of my time was spent in the cavernous and gloomy AIIM sessions area, where I gave an "Analyst Take" session on SharePoint 2010, a talk on dynamic case management, and reviewed suppliers of document output for customer communications. My impression was that floor activity had improved over the last two years. Perhaps contraction of sponsorships had hit the right balance with demand, or perhaps the great spring weather and improving economy were at work, but the mood was upbeat and the crowds were steady. Vendors were grumbling less. Cloud talk and SaaS were under-represented. E-discovery and records management were in line with expectations. And the usual interesting collection of arcane conversion, migration, capture, and other providers - usually in the lower-rent districts - continued the tradition. SharePoint was again pervasive. Those who say "that ship has come in" may not be aware of other ports and forms of transportation. One wonders what the future of the show is if the SharePoint sessions are the biggest draw and Microsoft and key partners have the biggest booths. Philly is a city that has lost its major corporate headquarters and no longer has growth industries - but it does not deserve its reputation. The AIIM show - with roots in microfilm and paper - is similar - and likewise - is still pretty good.
Product strategists struggle with the issue of value all the time: What constitutes a revenue-maximizing price for my product, given the audience I’m targeting, the competition I’m trying to beat, the channel for purchase, and the product’s overall value proposition?
There are tools like conjoint analysis that can help product strategists test price directly via consumer research. However, there’s a bigger strategic question in the background: How can companies create and sustain consistently higher prices than their key competitors over the long term?
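To make the conjoint idea concrete, here is a minimal sketch of how part-worth utilities from a choice study can be converted into willingness-to-pay estimates. All of the numbers and feature names below are hypothetical, purely for illustration; a real study would estimate them from respondent choice data.

```python
# Hypothetical part-worth utilities from a choice-based conjoint study
# of laptop buyers (illustrative values, not real research output).
part_worths = {
    "premium_build": 0.90,   # aluminum vs. plastic chassis
    "longer_battery": 0.45,  # +4 hours of battery life
    "brand_halo": 0.60,      # the premium brand itself
}

# Estimated disutility per $100 of price (also hypothetical).
price_utility_per_100 = -0.30

def willingness_to_pay(utility, price_coef_per_100=price_utility_per_100):
    """Convert a part-worth utility into dollars: WTP = utility / |price coef|."""
    return utility / abs(price_coef_per_100) * 100

premium = sum(willingness_to_pay(u) for u in part_worths.values())
for feature, u in part_worths.items():
    print(f"{feature}: ${willingness_to_pay(u):.0f}")
print(f"Total defensible premium: ${premium:.0f}")
```

The key strategic insight is the ratio: a feature only supports a price premium in proportion to its utility relative to the pain of paying more.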
The Mac represents a good case study for this business problem. Macs have long earned a premium over comparable Windows PCs. Though prices for Macs have come down over time, they remain relatively more expensive, on average, than Windows-based PCs. In fact, Apple has successfully cornered the market on higher-end PCs: According to companies that track the supply side, perhaps 90% of PCs that sold for over $1,000 in Q4 2009 were Macs.
Macs share common characteristics with Windows PCs on the hardware front – ever since Apple switched to Intel processors about four years ago, they’ve had comparable physical elements. But the relative pricing for Macs has remained advantageous to Apple. At the same time, the Mac has gained market share and is bringing new consumers into the Mac family – for example, about half of consumers who bought their Mac in an Apple Store in Q1 2010 were new to the Mac platform. So Apple is doing something right here – providing value to consumers that makes them willing to pay more.
Organizations of all sizes have a growing number of traveling employees, some or many of whom travel overseas. IT should learn from the recent Icelandic volcanic eruption, which shut down many airports across northern and central Europe for several days. Hundreds of thousands of traveling iWorkers were stranded, unable to reach their travel destination or even return home. The fortunate ones were those who’d been
There’s a lot of hype out there from vendors who claim that they have tools and technologies to enable end-user self-service in BI. Do they? When you analyze whether your BI vendor can support end-user self-service, consider the following types of “self service” and related BI tool requirements:
#1. Self service for average, casual users.
What do these users need to do?
Run and lightly customize canned reports and dashboards
Run ad hoc queries
Add calculated measures
Fulfill their BI requirements with little or no training (typically one needs a search-like, not a point-and-click, UI for this)
What capabilities do they need for this?
Report and dashboard templates
Customizable prompts, sorts, filters, and ranks
Report, query, dashboard building wizards
Semantic layer (not all BI vendors have a rich semantic layer)
Prompting for columns (not all BI vendors let you do that)
Drill anywhere (only BI vendors with ROLAP and multisourcing / data federation provide this capability)
#2. Self service for advanced, power users
What do these users need to do?
Perform what-if scenarios (this often requires write back, which very few BI vendors allow)
Add metrics, measures, and hierarchies not supported by the underlying data model (typically one needs some kind of in-memory analytics capability for this)
Explore based on new (not previously defined) entity relationships (typically one needs some kind of in-memory analytics capability for this)
Explore data without knowing exactly what they’re looking for (typically one needs a search-like UI for this)
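To illustrate the what-if capability in the power-user list above, here is a minimal sketch (not any vendor's API, and all data is hypothetical) of a calculated measure evaluated over an in-memory slice of data, with an assumption the user can tweak without touching the underlying data model:

```python
# Hypothetical in-memory slice of a sales fact table.
sales = [
    {"region": "East", "units": 1200, "price": 25.0},
    {"region": "West", "units": 900,  "price": 27.5},
]

def revenue(rows, price_uplift=0.0):
    """Calculated measure: total revenue under a what-if price change."""
    return sum(r["units"] * r["price"] * (1 + price_uplift) for r in rows)

baseline = revenue(sales)
scenario = revenue(sales, price_uplift=0.05)  # what if prices rise 5%?
print(f"baseline=${baseline:,.0f} scenario=${scenario:,.0f}")
```

In a real BI tool with write-back or in-memory analytics, the user would define the uplift and the new measure through the UI; the point is that the scenario never has to be materialized in the warehouse.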
Over the past few months, I had the opportunity to interview representatives from 10 leading technology service providers about how they help their clients innovate. My recent research summarizing those interviews is available to Forrester clients on our website. For those interested in the high level points I raised, here are a few of the key findings:
Every year, I take 250 to 300 calls from Forrester clients. The vast majority of these calls are from executives embroiled in the process of trying to select the right CRM technology solution to support their business strategy. From these conversations, I have distilled a set of decision criteria to help you quickly cut through the CRM tech vendor underbrush.
Ability to meet your specific business requirements. You have to know what business outcomes you are trying to achieve, and define the business capabilities that you need to support, before you seriously consider investing in a CRM software solution. Although the core capabilities of leading CRM software vendors are quite similar, the companies I hear from still place a very high importance on the solution meeting the functional and technology criteria that are specific to their needs. Can the vendor meet your use-case requirements?
Ease of use for front-line workers. My clients expect CRM software to make people more productive in their work, and this is predicated on how easy the solution is to use. Good usability encourages user adoption. Is the solution UI modern and adaptable to diverse role-based requirements?
Capability to provide advanced analytic abilities. My clients place a high value on CRM vendors' ability to provide analytic tools to better understand customer behavior and make insightful customer-facing decisions using the myriad customer data collected. Analytics are the key to unlocking the value in CRM applications. Does the vendor have powerful and easy-to-use business intelligence capabilities?
Last week I published two research reports on the hottest topic in PCI: tokenization and transaction encryption. Part 1 was an introduction to the topic, and Part 2 provided some action items for companies to consider during their evaluation of these technologies. Respected security blogger Martin McKeay commented on Part 1. Serendipitously, Martin was also in Dallas (where I live) last week, and we got an opportunity to chat in person about the report and other security topics.
Martin’s post highlighted several issues that deserve some response. He felt that I “glossed over several important points people who are considering either technology need to be aware of.” Let me review those items:
Comment: “This is one form of tokenization, but it completely ignores another form of tokenization that’s been on the rise for several years; internal tokenization by the merchant with a (hopefully) highly secure database that acts as a central repository for the merchant’s cardholder data, while the remainder of the card flow stays the same as it is now.”