I recently had a meeting with executives from Tech Mahindra, an India-based IT services company, which was refreshing both for the candor with which they discussed the overall mechanics of a support and integration model with significant components located half a world away and for their insights into the realities and limitations of automation, one of the hottest topics in IT operations today.
On the subject of the mechanics and process behind their global integration work, the eye-opener for me was the depth of internal process behind the engagements. The common mindset of “develop the specs, send them off, and receive code back” (possibly common only in my mind, since I have had less exposure to these companies than some of my peers) is no longer even remotely realistic. A successful complex integration project takes a reliable set of processes that can link the efforts of the roughly 20 – 40% of staff on-site with the client to the supporting teams back in India, plus a massive investment in project management, development frameworks, and collaboration tools, a hallmark of all the successful Indian service providers.
From the client I&O group's perspective, the relationship between the outsourcer and internal groups becomes much more than an arm's-length process; it is a tightly integrated team in which the main visible differentiator is who pays whose salary rather than any strict team, task, or function boundary. For the integrator, this is a strong positive, since it makes it difficult for the client to disengage and gives the teams early knowledge of changes and new project opportunities. From the client side there are both drawbacks and benefits: disengagement is difficult, but knowledge transfer is tightly integrated and efficient.
Over the past few years, IBM has certainly copped its fair share of criticism in the Asian media, particularly in Australia. Whether this criticism is deserved or not is beside the point. Perception is reality — and it’s led some companies and governments to exclude IBM from project bids and longer-term sourcing deals. On top of this, the firm’s recent earnings in Asia Pacific have disappointed.
But I’ve had the chance to spend some quality time with IBM at analyst events across Asia Pacific over the past 12 months, and it’s clear that the company does some things well — in fact, IBM is sometimes years ahead of the pack. For this reason, I advise clients that it would be detrimental to exclude IBM from a deal that may play to one of these strengths.
IBM’s value lies in the innovation and global best practices it can bring to deals; the capabilities coming out of IBM Labs and the resulting products, services, and capabilities continue to lead the industry. IBM is one of the few IT vendors whose R&D has struck the right balance between shorter-term business returns and longer-term big bets.
The app economy is blurring the lines and opening up new opportunities, with many new entrants in the mobile space, whether in mobile CRM and analytics, store analytics, or dedicated gaming analytics. Players such as Flurry, Urban Airship, Crittercism, Kontagent, Trademob, Apsalar, App Annie, and Localytics, to name a few, have collectively raised more than $250 million. Expect a lot of innovation and acquisitions in that space once mobile is more naturally integrated into digital marketing strategies.
On average, mobile now represents more than 20% of overall traffic to websites. For some companies, including many in media, more than half of all visits come via mobile devices. In some countries, such as India, mobile has surpassed PC traffic. Marketers are integrating mobile as part of their marketing mix, but too many have not defined the metrics they’ll use to measure the success of their mobile initiatives. Many lack the tools they need to deeply analyze traffic and behaviors to optimize their performance.
Thirty-seven percent of marketers we surveyed do not have defined mobile objectives. For those who do, goals are not necessarily clearly defined, prioritized, and quantified. Half of marketers surveyed have neither defined key performance indicators nor implemented a mobile analytics solution! Most marketers consider mobile as a loyalty channel: a way to improve customer engagement and increase satisfaction. Marketers must define precisely what they expect their customers to do on their mobile websites or mobile apps, and what actions they would like customers to take, before tracking progress.
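The point about defining metrics before tracking can be made concrete. Here is a minimal sketch, with hypothetical field names and made-up numbers, of the kind of KPI definitions a marketer would need to pin down before a mobile analytics tool can report anything meaningful:

```python
# Illustrative sketch (hypothetical KPI names and data): simple mobile KPIs
# computed from raw visit and event counts. The point is that each KPI must
# be defined as an explicit formula before any tracking tool can measure it.

def mobile_kpis(visits, app_installs, purchases, repeat_visits):
    """Return a dict of example mobile KPIs; all names are illustrative."""
    return {
        "conversion_rate": purchases / visits,   # purchases per visit
        "install_rate": app_installs / visits,   # app installs per visit
        "loyalty_rate": repeat_visits / visits,  # returning-visitor share
    }

# Made-up monthly numbers for a hypothetical mobile site:
kpis = mobile_kpis(visits=10_000, app_installs=450,
                   purchases=120, repeat_visits=3_200)
print(kpis)  # conversion_rate 0.012, install_rate 0.045, loyalty_rate 0.32
```

The formulas are trivial; the hard part, as the survey numbers suggest, is agreeing on which of them actually matter for a loyalty-oriented channel.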
For the past ten years, the major IT initiative within Chinese organizations has been service-oriented and/or process-driven architecture. The pace of change has been slow for two reasons: 1) from an end user perspective, related business requirements are not clear or of high priority; 2) more importantly, solution providers have not been ready to embrace technology innovation and meet emerging technology requirements through new business models.
Times are changing. IBM and other major ISVs/SIs in China (as well as end users) are driving momentum around emerging technologies such as cloud and enterprise mobility. I recently attended the IBM Technical Summit 2013 in Beijing from July 11 to 12. Here’s what I learned:
Telecom carriers, supported by technology vendors, will accelerate cloud adoption by SMEs. Contributing more than 60% of total GDP in China, small and medium enterprises (SMEs) have always sought to simplify their IT operations as much as possible while scaling them up as quickly as possible when business expands. IaaS solutions appear to be a perfect match for SMEs; however, IT professionals still have concerns about security and data privacy when operations are run by other companies.
Yesterday Intel held a major press and analyst event in San Francisco to talk about its vision for the future of the data center, anchored on what has become in many eyes the virtuous cycle of future infrastructure demand – mobile devices and “the Internet of Things” driving cloud resource consumption, which in turn spews out big data, which spawns storage and the requirement for yet more computing to analyze it. As usual with these kinds of events from Intel, it was long on serious vision and strong on strategic positioning, but a bit parsimonious with actual future product information, with a couple of interesting exceptions.
Content and Core Topics:
No major surprises on the underlying demand-side drivers. The proliferation of mobile devices, the impending Internet of Things, and the mountains of big data they generate will combine to drive continued demand for cloud-resident infrastructure, particularly servers and storage, both of which present Intel with an opportunity to sell semiconductors. Needless to say, Intel laced its presentations with frequent reminders about who is the king of semiconductor manufacturing.
I concluded my March 2013 report on the role of software assets in business innovation by proposing that “The combination of software assets, strong domain expertise, analytics, and as-a-service delivery models will increasingly allow traditional service providers to reinvent the way they deliver business value to their clients.” I was glad to hear that IBM recently announced a deal with L’Oréal that directly supports this position. The announced engagement actually includes all these components:
The procurement domain expertise of IBM Global Business Services addresses business pain points. L’Oréal USA grew rapidly over the past few years via an aggressive acquisition strategy that caused indirect procurement processes to remain highly disparate. The company knew that there was a significant gap between negotiated savings and realized savings in its indirect procurement operations. IBM GBS consultants brought strong procurement expertise to work with L’Oréal’s existing sourcing team to transform existing processes. IBM Global Process Services (GPS) category experts are working with L’Oréal to develop and implement category sourcing strategies.
IBM has just announced that one of Australia’s “big four” banks, the ANZ, will adopt the IBM Watson technology in their wealth management division for customer service and engagement. Australia has always been an early adopter of new technologies but I’d also like to think that we’re a little smarter and savvier than your average geek back in high school in 1982.
IBM’s Watson announcement is significant, not necessarily because of the sophistication of the Watson technology, but because of IBM's ability to successfully market the Watson concept.
To take us all back a little, the term ‘cognitive computing’ emerged in response to the failings of what was once termed ‘artificial intelligence’. Though the underlying concepts have been around for 50 years or more, AI remains a niche and specialist market with limited applications and a significant trail of failed or aborted projects. That’s not to say that we haven’t seen some sophisticated algorithm-based systems evolve. There’s already a good portfolio of large-scale, deep analytic systems developed in areas such as fraud, risk, forensics, medicine, and physics.
The industry is abuzz with speculation that IBM will sell its x86 server business to Lenovo. As usual, neither party is talking publicly, but at this point I’d give it a better than even chance, since usually these kind of rumors tend to be based on leaks of real discussions as opposed to being completely delusional fantasies. Usually.
So the obvious question then becomes “Huh?”, or, slightly more eloquently stated, “Why would they do something like that?”. Aside from the possibility that this might all be fantasy, two explanations come to mind:
1. IBM is crazy.
2. IBM is not crazy.
Of the two explanations, I’ll have to lean toward the latter, although we might be dealing with a bit of the “Hey, I’m the new CEO and I’m going to do something really dramatic today” syndrome. IBM sold its PC business to Lenovo to a chorus of popular disbelief and dire predictions, and it's doing very well today because it transferred its investments and focus to higher-margin business, like servers and services. Lenovo makes low-end servers today that it bootstrapped with IBM licensed technology, and IBM is finding it very hard to compete with Lenovo and other low-cost providers. Maybe the margins on its commodity server business have sunk below some critical internal benchmark for return on investment, and it believes that it can get a better return on its money elsewhere.
In his 1956 sci-fi novel “The City and the Stars”, Arthur C. Clarke puts forth the fundamental design tenet for making eternal machines: “A machine shall have no moving parts”. To someone from the 1950s, current computers would appear to come close to that ideal – the CPUs and memory perform silent magic and can, with some ingenuity, be passively cooled, and invisible electronic signals carry information in and out of them to networks and … oops, to rotating disks, still with us after more than five decades. But, as we all know, salvation has appeared on the horizon in the form of solid-state storage, so-called flash storage (actually an idea of several decades’ standing as well, just not affordable until recently).
The initial substitution of flash for conventional storage yields immediate gratification in the form of lower power, possibly lower cost if used effectively, and higher performance, but the ripple-effect benefits of flash can be even more pervasive. However, the major architectural changes that flash enables across the whole IT stack pose a difficult conceptual challenge for users and are addressed only piecemeal by most vendors. Enter IBM and its FlashAhead initiative.
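The scale of that initial gratification is easy to see with a back-of-envelope comparison. The numbers below are illustrative assumptions, not vendor specifications, but they show why removing the moving parts changes the performance picture so dramatically:

```python
# Back-of-envelope latency comparison (illustrative numbers, not vendor
# specs). A random I/O on a 15K RPM disk pays a seek plus, on average,
# half a rotation; a flash read pays neither mechanical penalty.

disk_seek_ms = 3.5                    # assumed average seek time
half_rotation_ms = 60_000 / 15_000 / 2  # half a rotation at 15K RPM = 2 ms
disk_latency_ms = disk_seek_ms + half_rotation_ms

flash_latency_ms = 0.1                # assumed ~100 microsecond read

print(f"disk : {disk_latency_ms:.1f} ms/IO -> ~{1000 / disk_latency_ms:.0f} IOPS per spindle")
print(f"flash: {flash_latency_ms:.1f} ms/IO -> ~{1000 / flash_latency_ms:.0f} IOPS per device")
```

Two orders of magnitude more random I/O per device is exactly the kind of shift that, as noted above, ripples up through the whole stack rather than staying contained in the storage tier.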
What is Happening?
On Friday, April 11, IBM announced a major initiative, to the tune of a spending commitment of $1B, to accelerate the use of flash technology by means of three major programs:
· Fundamental flash R&D
· New storage products built on flash-only memory technology
With the next major spin of Intel server CPUs due later this year, HP’s customers have been waiting for the next iteration of its core c-Class BladeSystem, which has been on the market for almost seven years without any major changes to its basic architecture. IBM has effectively replaced its BladeCenter architecture with the new PureSystems, and Cisco’s offering is new enough that it should last at least another three years without a major architectural refresh, leaving HP customers to wonder when HP would introduce its next blade enclosure and whether it would be compatible with current products.
At its partner conference this week, HP announced a range of enhancements to its blade product line that in combination represent a strong evolution of the current product while maintaining compatibility with current investments. This positioning is similar to what IBM did with its BladeCenter to BladeCenter-H upgrade, preserving customer investment and extending the life of the current server and peripheral modules for several more years.
Tech Stuff – What Was Announced
Among the goodies announced on February 19 was an assortment of performance and functionality enhancements, including:
Platinum enclosure — The centerpiece of the announcement was the new c7000 Platinum enclosure, which boosts the speed of the midplane signal paths from 10 GHz to 14 GHz, a 40% increase in the raw bandwidth of the critical midplane across which all of the enclosure I/O travels. In addition to the higher-bandwidth midplane, the new enclosure incorporates location-aware sensors and doubles the available storage bandwidth.
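The 40% figure follows directly from the signaling-rate change, since raw midplane bandwidth scales linearly with signal speed over the same physical paths:

```python
# Sanity check on the claimed midplane speed-up: raw bandwidth scales
# linearly with the signaling rate over the same number of lanes.
old_ghz, new_ghz = 10.0, 14.0
increase = (new_ghz - old_ghz) / old_ghz
print(f"{increase:.0%} more raw midplane bandwidth")  # 40% more raw midplane bandwidth
```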