New Year’s Eve is the time for looking back at the past year before preparing for the next on New Year’s Day. So, I’m taking the time before the festivities tonight to take stock of 2011 and put down my thoughts on what were the top 10 events in the tech world. This is one person’s opinion, so feel free to voice your own counterpoints.
In reverse order (and with apologies to David Letterman):
10. Microsoft’s acquisition of Skype. I’m still not clear about how Microsoft is going to use Skype, but Skype’s expanding role as a platform for person-to-person videochats may make this one of Microsoft’s better acquisitions.
9. IBM’s Watson wins Jeopardy!, setting the stage for deep analytical solutions to other business problems. The average person doesn’t understand technology. But many people follow the Jeopardy! game show on TV. By developing an artificial intelligence system that could beat the best human contestants on Jeopardy! and giving it the human name of Watson, IBM did a brilliant job of showing its technologies’ potential in a way the average person could understand. More importantly, it has followed up by building new Watson-based solutions for healthcare diagnostics, financial services risk management, and other business situations.
8. Microsoft/Nokia partnership for Nokia to adopt Microsoft’s Windows Phone operating system for its smartphones. Both Microsoft and Nokia have struggled to keep up with Apple and Google in the smartphone market. By combining forces, they gave themselves another chance to become a credible third option in that market.
Few would dispute that cloud computing has huge potential for making IT service expenditures more cost-effective and flexible. But as is often the case, what is now possible is not necessarily practical or even desirable from the standpoint of the buying customer, both in accommodating longstanding preferences and in meeting specific contractual terms.
For example, consider these aspects of cloud computing:
Variable pricing means unpredictable spending. One of the lessons of the early utility models of the early 2000s was that customers’ preference for predictable expenditures often trumped variability based on consumption. The same is true today with even more inherently fungible cloud services. Moreover, a sudden, wholesale shift from capital spending to expense spending is impractical for many customers.
Rapid provisioning outpaces customer lead times. Rapid provisioning, one of cloud computing’s principal calling cards, presents huge advantages compared to server provisioning times measured in months, but customers’ own provisioning processes usually cannot take full advantage of provisioning times measured in mere minutes.
Pricing based on resource units can bring challenges. For example, testing-as-a-service allows customers to pay on the basis of test cases executed, but few customers are as yet ready or comfortable paying in this manner.
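To make the first point concrete, here is a minimal sketch of how consumption-based pricing translates usage swings directly into spending swings. All figures (the flat fee, the per-unit rate, the usage numbers) are hypothetical, chosen only for illustration:

```python
# Compare a flat-fee contract with consumption-based cloud pricing.
# All inputs below are illustrative assumptions, not real price points.

FIXED_MONTHLY_FEE = 10_000   # hypothetical flat contract price, USD/month
RATE_PER_UNIT = 0.12         # hypothetical USD per resource unit consumed

# Six months of hypothetical usage (resource units)
monthly_usage = [70_000, 95_000, 60_000, 110_000, 80_000, 130_000]

variable_bills = [round(u * RATE_PER_UNIT, 2) for u in monthly_usage]
fixed_bills = [FIXED_MONTHLY_FEE] * len(monthly_usage)

print("variable bills:", variable_bills)  # swings month to month
print("fixed bills:   ", fixed_bills)     # predictable, easy to budget
print("variable swing: $%.2f" % (max(variable_bills) - min(variable_bills)))
```

Even with modest usage variation, the consumption-based bill swings by thousands of dollars a month, which is exactly the unpredictability many buyers reject.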
Reading the recent Harvard Business Review article from Tom Davenport et al., it occurred to me that next best offer (NBO) is actually a subset of what my colleague Jim Kobielus calls “next best action” (NBA). And when you couple that predictive thinking with advances in process mining (see Wil van der Aalst’s post and the Process Mining Manifesto), it clearly becomes possible to optimize operations dynamically on the fly. First of all, the organization could mine the existing system (the transaction logs of traditional systems or a newly implemented BPM/CRM system) to identify what happens today. This then enables you to identify the outcomes that are most interesting (or those you want to achieve) and then optimize the NBA accordingly.
We take for granted a process definition where the next action is predetermined by the arc of the process definition. But if we can do NBO in 200 milliseconds, we can also do NBA in a similar time frame. Directed arcs in process models and the business rules that go with them start to become a little redundant. This sort of combination (mining and NBA) enables wide-open goal-oriented optimization for all sorts of processes, not just those related to marketing and cross-sell/upsell ideas.
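The mining-plus-NBA combination described above can be sketched very simply: mine an event log for the historical success rate of each candidate action, then pick the action with the best rate rather than following a predetermined process arc. The log contents and action names below are entirely hypothetical:

```python
# Minimal next-best-action (NBA) sketch: "mine" a hypothetical event log
# for each action's historical success rate, then choose the action with
# the best rate instead of following a fixed process arc.
from collections import Counter

# Hypothetical transaction log entries: (action_taken, outcome_achieved)
log = [
    ("offer_discount", True), ("offer_discount", False),
    ("send_reminder", True), ("send_reminder", True),
    ("send_reminder", False), ("escalate", False),
]

attempts, successes = Counter(), Counter()
for action, succeeded in log:
    attempts[action] += 1
    successes[action] += succeeded  # True counts as 1

def next_best_action(candidates):
    """Pick the candidate with the highest mined success rate."""
    return max(candidates, key=lambda a: successes[a] / attempts[a])

print(next_best_action(["offer_discount", "send_reminder", "escalate"]))
# -> send_reminder (2 successes in 3 attempts beats the alternatives)
```

A production system would of course score actions with a trained predictive model rather than raw historical rates, but the control flow is the same: the next step is computed from outcomes, not hard-coded in the process model.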
When I moved to India about two years ago, I arrived with my own expectations regarding emerging markets. One of them was that the lack of legacy IT applications and infrastructure would make these markets an ideal place for new technologies and delivery models like as-a-service to thrive. In other words, organizations in emerging markets would “leapfrog” to new technologies without going through some of the prior technology investments witnessed in developed markets. Unfortunately, the reality is not that simple.
One of the key takeaways of my recent reports (Australia, China, India Set The Pace For Asian IT Services and The Changing Face Of ASEAN IT Services — to be published in January 2012) is that most of the growth in emerging countries will come from traditional IT services such as ERP implementation, infrastructure deployment, and system integration. Contrary to common belief, emerging services — including cloud and mobility — will represent less than 20% of the total annual growth in emerging markets in 2015.
I see several reasons for this:
Lack of governance and planning. In most organizations, the IT department’s role is merely that of a provider of applications and infrastructure, whose main objective is to react to business needs.
Lack of internal skills. Client organizations do not have the adequate skills internally to take on complex transformational projects involving new technologies such as virtualization, business analytics, and mobile enterprise application integration platforms.
Lack of IT services culture. Most client organizations in emerging markets leverage external skills to help them with basic tasks such as hardware maintenance and software deployment.
Yeah, the tune is playing in my head. Video Killed the Radio Star. But in this case, it's Apple's iMessage service that's killing the SMS cash cow. For those of you who haven't experienced it yet, check out this picture.
It's my riding buddy Joe sending me a text message, or in this case, an iMessage. The blue box is the giveaway -- it came over Apple's texting service, not AT&T's SMS service. It's "free." That is, it travels over the Internet, not the SMS network, and it's free on Wi-Fi or included in my wireless data plan. And while I have unlimited texting, I do pay $30/month for the family plan, which worked out to about $0.10/message last month. (I know, some of you text so much that it's probably a penny a message or less.)
So, let's do the math:
100 million iOS users.
Sending 50 messages a month to another iOS user. (iOS users move in packs.)
Both the sender and the receiver pay for an SMS message, so that's 100 billed messages per person (50 sent plus 50 received).
Each SMS message costs (let's say) $0.05.
So 100,000,000 iOS users x 100 iMessages/month x $0.05/message = $500,000,000/month.
Said another way, that's $6B taken out of the SMS value chain by the iOS iMessage service every year. Then there's the BlackBerry Messenger service for inter-BlackBerry messages. And the Magic SMS app for iPhone and Android. And probably a hundred other SMS alternatives that I'll never know about. Add it all up, and $10 billion in SMS value (not revenue) could be siphoned off to the wireless data market in 2013.
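For anyone who wants to check the back-of-the-envelope math above, here it is spelled out. Every input is one of the post's own stated assumptions, not a measured figure:

```python
# The iMessage math, using the post's assumptions.
ios_users = 100_000_000          # assumed iOS installed base
messages_sent_per_month = 50     # assumed iMessages sent per user
billed_per_person = messages_sent_per_month * 2  # sender AND receiver pay
cost_per_sms = 0.05              # assumed cost per SMS, USD

monthly = ios_users * billed_per_person * cost_per_sms
annual = monthly * 12

print(f"${monthly:,.0f}/month")      # $500,000,000/month
print(f"${annual / 1e9:.0f}B/year")  # $6B/year
```

Note that doubling for sender-plus-receiver is what turns 50 sent messages into 100 billed messages per person; without it, the annual figure would be $3B, not $6B.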
In the spirit of the somewhat overstated movie advertisement: “If you only read one of my blogs this year, read this one”; although I prefer the version for The Naked Gun 2 1/2: “If you only see one movie this year... you should get out more often.”
Anyway, so many blogs, so little time; and what did I say that was important (if only in my own tiny mind)? Each bullet links through to the original blog.
The machines created this mess; let them clean it up.
On the one hand, enterprises need to make ever more content available in multiple languages. As I noted in my last post on translation, the drivers include the flood of content generated online (much of it created by consumers), the growing importance of business in emerging markets, and the desire to enable global collaboration among employees. On the other hand, advances in machine translation and new approaches such as crowdsourcing are making translation ever faster and less expensive. This is no fortunate coincidence: The very computing dynamics that enabled the Web and especially Web 2.0 -- rapid increases in processor speed, cheap storage, and high-speed networks, combined with social technologies -- also empower the latest technology-based solutions to translation and localization.
What it means (WIM): Computers have allowed us to create a problem that only computers can help solve.
This is the first of an irregular series of blog posts on how technical advances, new solution paradigms, and evolving client needs are changing translation services and translation service providers (TSPs). I'll begin by offering a select glossary of some of the unfamiliar terms end users encounter when they begin to investigate translation services.
MT: Machine Translation, which simply means the use of computing technologies and software to assist with the translation of content (usually text, but voice recognition is of growing importance) from one language ("the source") to another ("the target"). Machine translation takes two primary forms, namely:
Oracle yesterday reported surprisingly weak results for its fiscal quarter ending on November 30 (see December 20, 2011, "Oracle Reports Q2 GAAP EPS Up 17% to 43 Cents; Q2 Non-GAAP EPS up 6% to 54 Cents"), with total revenues up just 2%, software revenues up 7%, hardware revenues down 10%, and services revenues flat. Even worse, hardware product sales were down 14%, new software license revenues rose just 2%, and license revenues for Oracle applications actually fell by 4%. Oracle had set expectations for revenue growth of 5% to 15%, and most financial analysts had projected growth at the high end of that range, based on Oracle's license revenues in prior quarters growing by 22% to 34% for applications, and 14% to 27% for database and middleware revenues. Oracle attributed the shortfall in revenues to potential deals that failed to close by the end of the quarter due to buyer caution.
For the tech sector, this is a worrisome report. Oracle's software revenues had been consistently stronger than the overall tech market, growing by 17% in US dollars in the prior quarters in 2011. If Oracle's software revenue growth slips to 7%, does that imply that the rest of the tech market is going to see little or no growth in Q4 2011?
Next best action is the proving ground for advanced analytics and big data; it’s also the infrastructure that provides analytics- and rule-driven guidance across one or more customer-facing touchpoints. You can find next best action at the heart of multichannel customer relationship management (CRM) initiatives everywhere. It’s even present in a growing range of back-office business processes such as order fulfillment and supply chain management.
Next best action will continue to develop as an overarching business technology initiative for many companies in the coming year. The market is emerging and is becoming aware of itself as a substantial new niche, in much the same way that the Hadoop market flowered in the past year.
Here are some of the highlights that Forrester anticipates in the next best action arena in 2012:
The next-best-action market will continue to coalesce around core solution capabilities. Traditionally, next best action has been a capability embedded in your customer service, marketing, and other CRM applications. That remains the heart of the next-best-action solution market. However, the past several years have seen the development of a niche for next-best-action standalone infrastructure that you may deploy in conjunction with various CRM and back-office applications. In 2012, we will see more vendors converge on the next-best-action arena from various backgrounds, including predictive analytics, business process management (BPM), business rules management (BRM), complex event processing (CEP), decision automation, recommendation engine, and social graph analysis. Many established vendors will repackage and reposition their offerings in these segments under the banner of next best action in order to address hot new solution areas, including multichannel offer targeting, marketing campaign automation, and customer experience optimization.
It does not come as a real surprise that the deal aimed at merging AT&T's and Deutsche Telekom's US wireless operations got nowhere. We were expecting as much back in autumn. In our view, there are no winners as a result of this dropped deal, not even the US consumer, who can look forward to poorer network infrastructure and a weakened T-Mobile as the low-end market provider. Hence, the Federal Communications Commission and Justice Department achieved something of a Pyrrhic victory.
Whilst the collapsed deal is a major irritant for AT&T, it is a disaster for Deutsche Telekom, as it leaves T-Mobile US in a very difficult position. With about 10% of US wireless subscribers, T-Mobile US remains subscale. Its image is increasingly trending toward cheap rather than good value, given its patchy network coverage, especially in rural areas.
Deutsche Telekom's reluctance to prepare for a "no-deal scenario" leaves T-Mobile without a clear strategy. This lack of direction is very risky and only pushes T-Mobile further down a slippery slope toward increasing churn and revenue and margin challenges. Deutsche Telekom needs to communicate its plans for 4G roll-out, spectrum purchases, partnerships for network sharing, and device portfolio. Above all, Deutsche Telekom needs to decide soon whether to pursue an IPO, a sale to another operator or a financial investor, or a merger with the likes of Dish, Leap, Clearwire, Sprint, or even LightSquared. Ultimately, we expect Deutsche Telekom to opt for a merger scenario.