The Indian government announced its 2012-2013 budget on March 16, 2012. While the announced budget does not contain direct incentives to promote the domestic ICT industry, there will be adequate indirect opportunities for vendors to explore. The excise duty will increase from 10% to 12%; this will have a marginal impact on the sale of PCs (desktops, laptops, and tablets), but the government’s focus on improving infrastructure, creating efficient delivery mechanisms, and improving e-governance will provide substantial indirect opportunities to IT vendors.
The latest budget aims to achieve long-term and inclusive growth for the economy and is in sync with my upcoming report, “India’s 12th National Five-Year Plan (2012-2017) Provides Massive ICT Opportunities.” The report answers questions such as why and how technology will act as a key enabler for the Indian government to achieve its growth target.
The 2012-2013 budget will provide adequate ICT opportunities for vendors, such as:
Packaged and industry-specific applications, e-governance, mobile apps, and analytics will support the strong need for sustainable revenue sources to fund investments. A common problem India faces today is the significant imbalance between expenditures and revenues, and the budget categorically highlights the need to deliver more with existing resources. As a result, we will see increased demand for packaged and industry-specific applications, e-governance, and mobile apps that help generate sustainable revenue to fund investments. The outlay for e-governance projects will also increase by 210%, from the equivalent of US$62 million to US$192 million; applications from software vendors for e-governance initiatives will present some of the most exciting opportunities in India. In addition, the government will use analytical tools to identify gaps, take corrective actions, and improve revenue sources.
Join us at Forrester’s CIO Forum in Las Vegas on May 3 and 4 for “The New Age Of Business Intelligence.”
The amount of data is growing at tremendous speed — inside and outside of companies’ firewalls. Last year the public Web hit approximately 1 zettabyte (1 trillion gigabytes) of data, and the speed at which new data is created continues to accelerate: unstructured data in the form of text, semistructured data from M2M communication, and structured data in transactional business applications.
Fortunately, our technical capabilities to collect, store, analyze, and distribute data have also been growing at a tremendous speed. Reports that used to run for many hours now complete within seconds using new solutions like SAP’s HANA or other tailored appliances. Suddenly, a whole new world of data has become available to the CIO and his business peers, and the question is no longer if companies should expand their data/information management footprint and capabilities but rather how and where to start. Forrester’s recent Strategic Planning Forrsights For CIOs data shows that 42% of all companies are planning an information/data project in 2012, more than for any other application segment — including collaboration tools, CRM, or ERP.
My colleagues and I have just completed yet another engagement with a large client — one of dozens recently — that was facing a “to be or not to be” decision: whether to move its BI platform and applications to the cloud. It’s a very typical question that our clients are asking these days, mainly for the following two reasons:
In many cases, their current on-premises BI solutions are too inflexible to support the business now, much less in the future.
The relative success of cloud-based CRM (SFDC and others) solutions may indicate that cloud offers a better alternative.
These clients put these two statements together and make the reasonable assumption that cloud BI will solve many of the current BI challenges that cloud-based CRM solved. Reasonable? Yes. Correct? Not so fast — the only correct answer is “It depends.”
Let’s take a couple of steps back. First, let’s define applications or packaged solutions vs. platforms (because BI requires both).
Applications or packaged solutions
Subscribe to a solution like CRM
Provide standard business functions to all customers (which makes them different from “hosting”; see below)
Are difficult to tailor to specific needs
Are usually used synonymously (but incorrectly; see below) with software-as-a-service (SaaS)
Platforms for building solutions
Subscribe to tools and resources to build solutions like CRM
Provide standard technical functions to developers
Contain limited, if any, business application functionality
Are usually labeled either platform-as-a-service (PaaS) or infrastructure-as-a-service (IaaS)
It seems everyone’s obsessed with Facebook’s IPO right now. And while CMOs are beginning to understand the possibilities of Facebook and other social technologies to connect and engage with customers, many CIOs remain unclear on the value of Facebook.
A question many business executives ask is this: “What’s the value of having someone like your page?”
On its own, maybe not much. But the true potential lies in the ability to collect insights about the people who like brands, products, or services – be they your own or someone else’s.
For example, the chart below shows the percentage of consumers by age group who have “liked” Pepsi or Coca-Cola. These data suggest Coca-Cola is significantly more popular with 17- to 28-year-olds than Pepsi, while Pepsi appears more popular with the 36-to-70 crowd. I pulled these data points directly from the Facebook likes of each of the brand pages using a free consumer tool from MicroStrategy called Wisdom. Using this tool, I can even tell that Coca-Cola fans are likely to also enjoy the odd Oreo cookie and bag of Pringles.
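The comparison behind that chart is straightforward to reproduce once you have fan-level age data, however it is sourced (the Wisdom tool, a survey, or your own page insights). A minimal sketch of the bucketing logic, using made-up fan ages for illustration (none of these figures are actual Facebook data):

```python
from collections import Counter

# Hypothetical fan ages for two brand pages (illustrative only,
# not actual Facebook figures).
fans = {
    "Coca-Cola": [18, 22, 25, 27, 31, 45, 52, 19, 24, 38],
    "Pepsi":     [40, 44, 51, 63, 22, 37, 55, 48, 29, 60],
}

def age_bucket(age):
    """Map an age to a coarse bracket for comparison."""
    if age <= 28:
        return "17-28"
    if age <= 35:
        return "29-35"
    return "36-70"

def bucket_shares(ages):
    """Return each bracket's share of a page's fans, as a percentage."""
    counts = Counter(age_bucket(a) for a in ages)
    total = sum(counts.values())
    return {bucket: 100.0 * n / total for bucket, n in counts.items()}

for brand, ages in fans.items():
    print(brand, bucket_shares(ages))
```

With these invented inputs, the younger bracket dominates the Coca-Cola fan list and the older bracket dominates Pepsi’s, mirroring the shape of the chart above.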
As renowned data visualization expert Edward Tufte once said, “The world is complex, dynamic, multidimensional; the paper is static, flat. How are we to represent the rich visual world of experience and measurement on mere flatland?” There’s indeed just too much information out there to be analyzed effectively by all categories of knowledge workers. More often than not, traditional tabular row-and-column reports do not paint the whole picture or — even worse — can lead an analyst to the wrong conclusion. There are multiple reasons to use data visualization; the three main ones are that one:
Cannot see a pattern without data visualization. Simply seeing numbers on a grid often does not tell the whole story; in the worst case, it can even lead one to a wrong conclusion. This is best demonstrated by Anscombe’s quartet, where four seemingly similar groups of x and y coordinates reveal very different patterns when represented in a graph.
Cannot fit all of the necessary data points onto a single screen. Even with the smallest reasonably readable font, single line spacing, and no grid, one cannot realistically fit more than a few thousand data points using numerical information only. When using advanced data visualization techniques, one can fit tens of thousands of data points onto a single screen — a difference of an order of magnitude. In The Visual Display of Quantitative Information, Edward Tufte gives an example of more than 21,000 data points effectively displayed on a US map that fits onto a single screen.
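The Anscombe’s quartet point is easy to verify numerically: the four published datasets share nearly identical means, variances, and correlations, yet look completely different when plotted. A short sketch using the quartet’s actual values and only the standard library:

```python
from statistics import mean, variance

# Anscombe's quartet: four (x, y) datasets with nearly identical
# summary statistics but very different shapes when graphed.
x123 = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
quartet = {
    "I":   (x123, [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]),
    "II":  (x123, [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]),
    "III": (x123, [7.46, 6.77, 12.74, 7.11, 7.81, 8.84, 6.08, 5.39, 8.15, 6.42, 5.73]),
    "IV":  ([8, 8, 8, 8, 8, 8, 8, 19, 8, 8, 8],
            [6.58, 5.76, 7.71, 8.84, 8.47, 7.04, 5.25, 12.50, 5.56, 7.91, 6.89]),
}

def correlation(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

for name, (xs, ys) in quartet.items():
    print(f"{name}: mean_x={mean(xs):.2f} mean_y={mean(ys):.2f} "
          f"var_x={variance(xs):.2f} r={correlation(xs, ys):.3f}")
```

Every dataset reports a mean x of 9.00, a mean y of about 7.50, an x variance of 11.00, and a correlation near 0.816; only a graph reveals the line, the curve, and the two outlier-driven shapes hiding behind those identical numbers.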
Demands by users of business intelligence (BI) applications to "just get it done" are turning typical BI relationships, such as business/IT alignment and the roles that traditional and next-generation BI technologies play, upside down. As business users demand more control over BI applications, IT is losing its once-exclusive control over BI platforms, tools, and applications. It's no longer business as usual: For example, organizations are supplementing previously unshakable pillars of BI, such as tightly controlled relational databases, with alternative platforms. Forrester recommends that business and IT professionals responsible for BI understand and start embracing some of the latest BI trends — or risk falling behind.
Traditional BI approaches often fall short for the following two reasons (among many others):
BI hasn't fully empowered information workers, who still largely depend on IT.
BI platforms, tools, and applications aren't agile enough.
Emerging ARM server vendor Calxeda has been hinting for some time that it had a significant partnership announcement in the works, and while we didn’t necessarily disbelieve it, we hear a lot of claims from startups telling us to “stay tuned” for something big. Sometimes they pan out; sometimes they simply go away. But this morning Calxeda surpassed our expectations by unveiling just one major systems partner – and it just happens to be Hewlett-Packard, which dominates the worldwide market for x86 servers.
At their core (pun unintended but not bad), HP’s Hyperscale business unit’s Project Moonshot and Calxeda’s server technology are about improving the efficiency of web and cloud workloads, and they promise improvements in excess of 90% in power efficiency and similar improvements in physical density compared with current x86 solutions. As I noted in my first post on ARM servers and other documents, even if these estimates turn out to be exaggerated, there is still a generous window within which to do much, much better than current technologies. And workloads (such as memcache, Hadoop, and static web servers) will be selected for their fit to this new platform, so the workloads that run on it will potentially come close to the cases quoted by HP and Calxeda.
Marketing mix modeling solutions have been around for quite some time, providing marketers in several key categories with complex statistical models that aim to find the correlation between past marketing activities and business outcomes, like sales or market share.
However, this space has recently seen significant changes, driven by a few specific dynamics:
The proliferation of digital and social media with increasing importance in the marketing mix.
Marketers' increased demand for tools that are not only able to deliver insights on past campaigns but also able to give forward-looking recommendations on how to improve marketing return on investment (ROI) in the future.
The rising role that sophisticated software plays in integrating the ever-growing number of data streams and in enabling complex analysis to be navigated and customized via powerful graphic user interfaces.
To help navigate this complex and highly relevant space for senior marketers, our research team has published the first Forrester Wave™ for vendors in the marketing mix modeling space. We screened more than 30 vendors, shortlisted six that we consider to be the key players in this very fragmented market, and ranked them according to more than 40 different criteria. The evaluation uncovered a market in which:
MarketShare, Marketing Management Analytics, and ThinkVine lead the pack.
SymphonyIRI is a Leader but lacks collaborative functionality.
Marketing Analytics and Ninah are competitive Strong Performers.
Yesterday, HP agreed to buy UK software firm Autonomy Corp. for $10 billion to move into the enterprise information management (EIM) software business. HP wants to add IP to its portfolio, build next-generation information platforms, and create a vehicle for services. It is following IBM’s strategy of acquiring software to sell alongside its hardware and services. With Autonomy under its wing, HP plans to help enterprises with a big, complicated problem – how to manage unstructured information for competitive advantage. Here’s the wrinkle – Autonomy hasn’t solved that problem. In fact, it’s not a pure technology problem, because content is so different from data; it’s a people and process problem, too.
Here is the Autonomy overview that HP gave investors yesterday:
Of course, this diagram doesn’t look like the heterogeneous environment of a typical multinational enterprise. Autonomy has acquired many companies to fill in the boxes here, but the reality is that companies have products from a smorgasbord of content management vendors but no incentive to stick with any one of them.
There has been a great deal of talk over the past few years about what acronym will replace WCM (web content management). Web experience management? Web site management? Web engagement management? Web experience optimization? The list goes on and on.
Certainly, the evolution of the WCM term makes sense on paper, since traditional content management functionality now makes up only a portion of what WCM vendors offer. WCM vendors are also in the content delivery/engagement business and are even dipping their toes into web intelligence. However, Forrester clients still overwhelmingly ask about “WCM,” and that term isn’t going away any time soon.
But even without changing the acronym, it is time to start thinking about WCM beyond just managing content or siloed websites or experiences. Instead, we need to think of how WCM will interact and integrate with other solutions – like search, recommendations, eCommerce, and analytics – in the customer experience management (CXM) ecosystem in order to enable businesses to manage experiences across customer touchpoints.
How are we handling this convergence at Forrester? Several of us who cover various CXM products – like Brian Walker (commerce), Bill Band (CRM), Joe Stanhope (web analytics), and me (WCM) – teamed up to outline what our vision of CXM looks like, including process-based tools, delivery platforms, and customer intelligence. We've created two versions of the report: one written for Content & Collaboration professionals and one for eBusiness & Channel Strategy professionals.