In November, Forrester released its mobile predictions for 2016, highlighting how mobile will act as a catalyst for business transformation and explaining why the battle for mobile moments will redefine the vendor landscape.
Let’s now take a closer look at how mobile will impact marketing in 2016.
A year ago, Forrester argued that most brands would underinvest in mobile in 2015. This is likely to remain the case this year, since too many marketers still have a narrow view of mobile as a “sub-digital” medium and channel. This is good news for the 20% of marketers who told us they have the budget they need and for the 33% who said they know how to measure mobile ROI. In 2016, this growing minority of leading marketers will start to fully integrate mobile into their marketing strategies. These mature mobile marketers will measure the impact of mobile across channels, see a clear opportunity to differentiate their brands, and increase their investments in mobile initiatives. Here’s what else we expect to happen:
Integrating mobile into your marketing strategy will become a key differentiator. While most brands are trying to mobilize their ads, few are going the extra mile: serving their customers in their mobile moments by transforming the entire customer experience. Only those that do go that extra mile will differentiate their brands via mobile. Leaders will also start measuring the impact of mobile on offline channels and will end up allocating up to 20% of their marketing budgets to mobile.
As companies get serious about digital transformation, we see investments shifting toward extensible software platforms used to build and manage a differentiated customer experience. My colleague John McCarthy has an excellent slide describing what's happening:
Previously, tech management spent most of its time and budget managing a set of monolithic enterprise applications and databases. With an addressable market limited to a finite number of networked PCs, spending on the front end was largely an afterthought.
Today, applications must scale to millions, if not billions of connected devices while retaining a rich and seamless user experience. Infrastructure, in turn, must flex to meet these new specs. Since complete overhauls of the back end are a nonstarter for large enterprises with 30-plus years of investments in mainframes and legacy server systems, new investments gear toward the intermediary software platforms that connect digital touchpoints with enterprise applications and transaction systems.
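To make the intermediary-layer idea concrete, here is a minimal Python sketch of that pattern: a thin API facade that translates requests from modern digital touchpoints into calls against a legacy system of record. All class names, fields, and values here are invented for illustration; they stand in for whatever backend and contract a real enterprise would have.

```python
class LegacyOrderSystem:
    """Stand-in for a decades-old transaction system that cannot be rewritten."""

    def fetch_order_record(self, order_id):
        # Legacy systems often expose cryptic, flat records like this.
        return {"ORD_ID": order_id, "STAT_CD": "S", "AMT_CENTS": 1999}


class OrderAPI:
    """The intermediary platform layer: a clean, touchpoint-friendly contract
    on top of the legacy backend, so mobile and web apps never see its quirks."""

    STATUS = {"S": "shipped", "P": "pending"}

    def __init__(self, backend):
        self.backend = backend

    def get_order(self, order_id):
        rec = self.backend.fetch_order_record(order_id)
        # Translate the legacy record into a modern, readable shape.
        return {
            "id": rec["ORD_ID"],
            "status": self.STATUS[rec["STAT_CD"]],
            "total": rec["AMT_CENTS"] / 100,
        }


api = OrderAPI(LegacyOrderSystem())
print(api.get_order("A123"))  # → {'id': 'A123', 'status': 'shipped', 'total': 19.99}
```

The point of the pattern is that new digital experiences iterate against the facade's contract while the mainframe-era backend stays untouched.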
At Forrester, we’ve been working to quantify some of the most viable software categories that exemplify this shift. Here’s a shortlist:
· API management solutions: US CAGR 2015-2020: 22%.
· Public cloud platforms: Global CAGR 2015-2020: 30%. (Note: We have a forecast update in the works that segments the market into subcategories.)
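As a quick sanity check on what figures like these imply, here is a small Python sketch of how a constant CAGR compounds over a forecast window. The $1B base market size is an invented number for illustration; only the growth rates come from the forecasts above.

```python
def project_market(base, cagr, years):
    """Project a market size forward at a constant compound annual growth rate."""
    return base * (1 + cagr) ** years


# Hypothetical $1B market in 2015 growing at the 22% CAGR cited above:
print(f"${project_market(1.0, 0.22, 5):.2f}B")  # → $2.70B by 2020

# The same base at the 30% public-cloud CAGR:
print(f"${project_market(1.0, 0.30, 5):.2f}B")  # → $3.71B by 2020
```

In other words, a 22% CAGR means the market roughly 2.7x's in five years, and a 30% CAGR nearly quadruples it, which is why these categories draw so much investment attention.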
What’s taken artificial intelligence (AI) so long? We invented AI capabilities like first-order logical reasoning, natural-language processing, speech/voice/vision recognition, neural networks, machine-learning algorithms, and expert systems more than 30 years ago, but aside from a few marginal applications in business systems, AI hasn’t made much of a difference. The business doesn’t understand how or why it could make a difference; it thinks we can program anything, which is almost true. But there’s one thing we fail at programming: our own brain — we simply don’t know how it works.
What’s changed now? While some AI research still tries to simulate our brain or certain regions of it — and is frankly unlikely to deliver concrete results anytime soon — most of it now leverages a less human, but more effective, approach revolving around machine learning and smart integration with other AI capabilities.
What is machine learning? Simply put, it’s sophisticated software algorithms that learn to do something on their own through repeated training on big data. In fact, big data is what’s making the difference in machine learning, along with great improvements in many of the above AI disciplines (see the AI market overview that I coauthored with Mike Gualtieri and Michele Goetz on why AI is better and consumable today). As a result, AI is undergoing a renaissance, developing new “cognitive” capabilities to help in our daily lives.
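The phrase “learns by repeated training” can be shown in a few lines of Python. Below is a toy perceptron that learns the logical AND function from labeled examples; it is a deliberately minimal sketch of the idea, not representative of any product or real-world model.

```python
# Training examples: (inputs, desired output) for the logical AND function.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]  # learnable weights
b = 0.0         # learnable bias
lr = 0.1        # learning rate


def predict(x):
    """Fire (return 1) if the weighted sum of inputs exceeds zero."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0


# Repeated training: on each pass, nudge the weights toward the correct answer.
for _ in range(20):
    for x, label in data:
        error = label - predict(x)
        w[0] += lr * error * x[0]
        w[1] += lr * error * x[1]
        b += lr * error

print([predict(x) for x, _ in data])  # → [0, 0, 0, 1], the AND function, learned
```

No rule for AND was ever programmed in; the behavior emerged from examples and error correction. Scale the same principle up to millions of parameters and billions of examples, and you get the modern systems driving the renaissance described above.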
Software is getting smarter, thanks to predictive analytics, machine learning, and artificial intelligence (AI). Whereas the current generation of software is about enabling smarter decision-making for humans, we’re starting to see “invisible software” capable of performing tasks without human intervention.
One such example is x.ai, a software-based personal assistant that schedules meetings for you. There’s no user interface: you simply cc “Amy” on an email thread, and she goes to work engaging with the recipient to find a date and an optimal place to meet.
It’s not a perfectly automated system. AI trainers oversee Amy’s interactions and make adjustments on the fly. But over time, she becomes a great personal assistant who is sensitive to your meeting and communication preferences.
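One small piece of what such an assistant must do — finding a slot that works for both parties — can be sketched simply. This is an illustrative toy, not how x.ai actually implements Amy; the slot format and names are invented.

```python
def propose_slot(host_free, guest_free):
    """Return the earliest slot both parties have free, or None if there is none."""
    common = sorted(set(host_free) & set(guest_free))
    return common[0] if common else None


host = ["Mon 10:00", "Mon 14:00", "Tue 09:00"]
guest = ["Mon 14:00", "Tue 09:00", "Wed 11:00"]

print(propose_slot(host, guest))  # → Mon 14:00
print(propose_slot(["Mon 10:00"], ["Tue 09:00"]))  # → None
```

The hard part of the real product isn't this intersection, of course; it's parsing free-form email into structured availability and preferences, which is where the machine learning and the human AI trainers come in.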
One can imagine Amy extending into new domains — taking on parts of sales/customer service operations or business processes like expense management and DevOps. Indeed, we’ll see a new generation of AI-powered apps, as predicted here.
I am just back from the first-ever Cognitive Computing Forum, organized by DATAVERSITY in San Jose, California. I am not new to artificial intelligence (AI): I was a software developer in the early days of AI, fresh out of university. Back then, if you worked in AI, you were called a knowledge engineer, and you used symbolic programming (LISP), first-order logic programming (Prolog), or predicate calculus (MRS) to develop “intelligent” programs. Lots of research was done on knowledge representation and on tools to support knowledge engineers in developing applications that by nature required heuristic problem solving. Heuristics are necessary when problems are ill-defined, nonlinear, and complex. Deciding which financial product you should buy based on your risk tolerance, the amount you are willing to invest, and your personal objectives is a typical problem we used to solve with AI.
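To give a flavor of that era's approach, here is a toy rule-based sketch in the spirit of those early expert systems: recommending a financial product from risk tolerance, investable amount, and objective. The rules, thresholds, and product names are all invented for illustration; a real system of the time encoded hundreds of such heuristics elicited from human experts.

```python
def recommend(risk, amount, objective):
    """Toy heuristic rules mapping an investor profile to a product category."""
    if risk == "low":
        # Conservative investors get capital preservation regardless of goal.
        return "government bonds"
    if risk == "medium":
        return "balanced fund" if objective == "income" else "index fund"
    # High risk tolerance: amount determines whether active management pays off.
    return "equity fund" if amount < 50_000 else "managed portfolio"


print(recommend("medium", 20_000, "growth"))  # → index fund
print(recommend("low", 100_000, "income"))    # → government bonds
```

Note what this illustrates about the paradigm: the "intelligence" lives entirely in hand-written rules, which is precisely why such systems were brittle and expensive to maintain — and why the learning-from-data approach discussed above eventually won out.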
Fast-forward 25 years, and AI is back under a new name: cognitive computing. An old friend of mine, who’s never left the field, says, “AI has never really gone away, but it has undergone some major fundamental changes.” Perhaps it never really went away from labs, research, and very niche business areas. The change, however, is largely about the context: the hardware and software scale constraints are gone, and there are tons of data and knowledge digitally available (ironically, AI missed big data 25 years ago!). But this is not what I want to focus on.
At a CIO roundtable that Forrester held recently in Sydney, I presented one of my favourite slides (originally seen in a deck from my colleague Ted Schadler) about what has happened in technology since January 2007 (a little over five years ago). The slide goes like this:
Source: Forrester Research, 2012
This makes me wonder: what will the next five years hold for us? Forecasts tend to assume that most things remain the same, and I bet that in 2007 few people saw all of these changes coming. What unforeseen changes might we see?
Will the whole concept of the enterprise disappear as barriers to entry disappear across many market segments?
Will the next generation reject the “public persona” typical of the Facebook generation and perhaps return to “traditional values”?
How will markets respond to the aging consumer in nearly every economy?
How will environmental concerns play out in consumer and business technology purchases and deployments?
How will the changing face of cities change consumer behaviors and demands?
Will artificial intelligence (AI) technologies and capabilities completely redefine business?