I continue to believe that most consumers using an NFC device in 2012 will use it for device pairing or data sharing rather than for payments. Pairing NFC accessories and reading NFC smart tags will open up new opportunities. NFC will be a key technology for interacting with the world around you — and it is time to test it, as highlighted in this recent piece of research written by my colleague Anthony Mullen. There is an ongoing debate about whether NFC will replace bar codes; I think the two technologies serve different objectives and have different advantages but will continue to coexist. Radio and optical technologies are converging, as highlighted by French startup Mobilead, which does a fantastic job of delivering a great branded experience mixing QR codes and NFC tags.
Microsoft recently announced that it will change its European currency pricing policy from July 2012, and the effect could be a 20% price increase for UK customers. It didn’t publicize the change, preferring to let its resellers tell their customers as and when the change affects them, so I thought I’d tell my readers what you need to know. Firstly, here is some background. Most global software companies have one master price list in their home currency and reset price lists in other currencies every year or even every quarter using then-current exchange rates. Microsoft has always taken a different approach, having set €, £, and other prices in 2001 and continuing to use the same exchange rate ever since. There are pros and cons to this approach:
· Pro: local prices are stable and predictable. In contrast, € and £ prices from other US-based vendors may rise or fall by 20% from one year to the next as the currencies fluctuate. (This is one reason why SAP’s revenue rises and Oracle’s falls when the € weakens against the $, as these price changes affect demand.)
· Con: European companies pay more than their US-based peers. This doesn’t matter so much if you’re only competing with domestic rivals, but global companies see and resent the discrepancies.
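The two approaches can be sketched with a little arithmetic. The exchange rates and master price below are illustrative assumptions, not Microsoft's actual figures, but they show how a switch from a frozen 2001 rate to a current-rate policy could produce a jump on the order of 20%:

```python
def reprice(master_usd, fx_rate):
    """Convert a USD master price to local currency at a given rate."""
    return master_usd * fx_rate

master = 1000.0     # hypothetical USD master price
rate_2001 = 0.70    # illustrative GBP/USD rate frozen in 2001
rate_now = 0.84     # illustrative current GBP/USD rate

# Fixed-rate policy: the local price never moves, whatever the currency does.
fixed_price = reprice(master, rate_2001)

# Current-rate policy: the local price is reset at today's exchange rate.
floating_price = reprice(master, rate_now)

increase = (floating_price - fixed_price) / fixed_price
print(f"Fixed-rate price:   £{fixed_price:.0f}")
print(f"Current-rate price: £{floating_price:.0f}")
print(f"Change on switching: {increase:+.0%}")
```

With these illustrative rates the switch works out to +20% — which is why a one-off policy change can feel like a sudden price rise to customers whose local prices had been stable for a decade.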
Employees who use smart devices — PCs or mobile devices — for work have expanded their use of technology more than most people realize. How many devices do you think a typical information worker uses for work? If you only ask the IT staff, the answer will be that most use just a PC, some use a smartphone, and a few use a tablet. But our latest Forrsights Workforce employee survey asked more than 9,900 information workers in 17 countries about all of the devices they use for work, including personal devices they use for work purposes. It turns out that they use an average of about 2.3 devices.
About 74% of the information workers in our survey used two or more devices for work — and 52% used three or more! This means that the typical information worker has to figure out how to manage their information from more than one device. So they’ll be increasingly interested in work systems and personal cloud services that enable easy multidevice access, such as Dropbox, Box, SugarSync, Google Docs/Apps, Windows Live, and Apple iCloud.
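As a back-of-the-envelope check, the reported shares and the 2.3 average hang together. The distribution below is illustrative — it is consistent with the published figures (26% one device, 74% two or more, 52% three or more) but is not the actual survey breakdown:

```python
# Illustrative distribution consistent with the reported shares;
# "three or more" is treated as exactly three, so the average is a floor.
shares = {1: 0.26, 2: 0.22, 3: 0.52}  # devices -> share of info workers

avg_devices = sum(n * share for n, share in shares.items())
two_or_more = sum(share for n, share in shares.items() if n >= 2)

print(f"Average devices per worker: {avg_devices:.1f}")  # ~2.3
print(f"Share using two or more:    {two_or_more:.0%}")  # 74%
```

Even treating "three or more" conservatively as exactly three, the average lands at roughly 2.3 devices per worker — the figure the survey reports.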
When you dig into the data, the mix of devices info workers use for work is different from what IT provides. About 25% of those devices are mobile devices, not PCs, and 33% of workers use operating systems other than Microsoft’s.
My blog post Apple Infiltrates The Enterprise: 1/5 Of Global Info Workers Use Apple Products For Work! got lots of visibility because of how hot Apple is right now, but our data is much broader than just Apple. Our Forrsights Workforce and Hardware surveys have lots more data about all types of PCs and smart devices that information workers use for work, including types of operating systems — and we even know about what personal-only devices they have.
For example, as of the fall of 2011, the top three smartphone OSes have essentially the same share of the installed base of smartphones used for work by information workers across the globe (full-time workers in companies with 20 or more employees who use a PC, tablet, or smartphone for work one hour or more per day). See the chart below and the reference in the Monday, January 30, New York Times article on BlackBerry in Europe.
The proposed acquisitions of SuccessFactors by SAP and of Emptoris by IBM got me thinking about the impact of market consolidation on buyers — specifically the difference between dealing with independent specialists and dealing with technology giants selling a large portfolio of products and services. Sourcing professionals talk about wanting “one throat to choke,” but personally I’ve never met one with hands big enough to get round the neck of a huge vendor such as IBM or Oracle. Moreover, many of the giants organize their sales teams by product line, to ensure they fully understand the product they are selling, rather than giving customers one account manager for the whole portfolio who may not understand any of it in sufficient depth. Our clients complain about having to deal with just as many reps as before the acquisitions. They all now have the same logo on their business cards, but can’t fix problems outside their area, nor negotiate based on the complete relationship. It seems that buyers end up like Hercules, wrestling either with a Nemean lion or with a Lernaean hydra.
The acquirers’ press releases tend to take it for granted that customers will be better off with the one-stop shop. Bill McDermott, co-CEO of SAP, said, “Together, SAP and SuccessFactors will create tremendous business value for customers,” while Lars Dalgaard, founder and CEO of SuccessFactors, talks about “expanding relationships with SAP’s 176,000 customers.” Craig Hayman, general manager of industry solutions at IBM, said, “Adding Emptoris strengthens the comprehensive capabilities we deliver and enables IBM to meet the specific needs of chief procurement officers.”
Today HP announced a new set of technology programs and future products designed to move x86 server technology for both Windows and Linux more fully into the realm of truly mission-critical computing. My interpretation is that this is both a defensive and an offensive move on HP’s part: it protects HP as its Itanium/HP-UX portfolio slowly declines, and it offers attractive and potentially unique options for current and future customers who want to deploy increasingly critical services on x86 platforms.
Bearing in mind that the earliest of these elements will not be in place until approximately mid-2012, the key elements that HP is currently disclosing are:
ServiceGuard for Linux – This is a big win for Linux users on HP, and removes a major operational and architectural hurdle for HP-UX migrations. ServiceGuard is a highly regarded clustering and HA facility on HP-UX, and includes many features for local and geographically distributed HA; its absence on Linux is often cited as a migration risk. The availability of ServiceGuard for Linux by mid-2012 will remove yet another barrier to a smooth migration and will help ensure that HP retains the business as workloads move off HP-UX.
Analysis engine for x86 – Analysis engine is internal software that provides system diagnostics, predictive failure analysis and self-repair on HP-UX systems. With an uncommitted delivery date, HP will port this to selected x86 servers. My guess is that since the analysis engine probably requires some level of hardware assist, the analysis engine will be paired with the next item on the list…
HP made the right decision today to keep the Personal Systems Group. Beyond the reasons cited — supply chain and sales synergy and the expense of spinning out — it’s also crucial for HP to remain in the market for personal devices, which is entering a period of radical transformation and opportunity. The innovations spawned first by RIM with the BlackBerry, followed by the transformative effects of Apple’s iPhone and iPad, are beginning to ripple into the PC market. Apple’s MacBook Air and Lion operating system, combined with Microsoft’s Metro interface for Windows 8, herald the beginning of a transformation of personal computing devices. By keeping PSG, HP has the opportunity to innovate and differentiate in a PC market that will move away from commodity patterns.
For vendor strategists at vendors of all sizes, one of the lessons of HP's decision is that consumer businesses are becoming more relevant to succeeding in commercial products for end users. During the announcement call today, CEO Meg Whitman talked about the importance of "consumerization" in winning business from enterprises. I heartily endorse that view and look forward to sharing a report soon on how consumerization is changing commercial product development.
Do you think consumerization was a part of why HP kept PCs?
What effect do you think consumerization will have in IT markets?
I just spent several days at Dell World, and came away with the impression of a company that is really trying to change its image. Old Dell was boxes, discounts, and a low-cost supply chain. New Dell is applications, solutions, cloud (now there’s a surprise!), and investments in software and integration. OK, good image, but what’s the reality? All in all, I think they are telling the truth about their intentions, and their investments continue to be aligned with these intentions.
As I wrote about a year ago, Dell seems to be intent on climbing up the enterprise food chain. Its investment in several major acquisitions, including Perot Systems for services and a string of advanced storage, network, and virtual infrastructure solution providers, has kept the momentum going, and the products have been following to market. At the same time, I see solid signs of continued investment in underlying hardware, and their status as the #1 x86 server vendor in North America and #2 worldwide remains an indication of their ongoing success in their traditional niches. While Dell is not a household name in vertical solutions, they have competent offerings in health care, education, and trading, and several of the initiatives I mentioned last year are definitely further along and more mature, including continued refinement of their VIS offerings and deep integration of their much-improved DRAC systems management software into mainstream management consoles from VMware and Microsoft.
As soon as you think you understand software companies’ policies on virtualization, a new problem appears that makes you tear your hair out and scratch your now-bald head. This month’s conundrum is whether or not VMware’s ThinApp product breaches your Microsoft Windows license agreement:
However, Microsoft, via its knowledge base, claims that “Running multiple versions of Windows Internet Explorer, or portions of Windows Internet Explorer, on a single instance of Windows is an unlicensed and unsupported solution.” http://support.microsoft.com/kb/2020599/en-us#top
VMware doesn’t warn customers that ThinApp could cause them Microsoft licensing problems, but neither does it claim that it is legal. It merely advises customers to check with Microsoft.