Why More Is More

"Thank you for coming to this urgent, last-minute meeting today," I say to you and seven of our mutual colleagues. Pointing to a plate of 10 cookies on the table between us, I explain, "We're in luck, though. As a reward for your willingness to squeeze this meeting in, I have managed to sneak some cookies out of the executive meeting next door!"

We all smile, and that look on your face tells me that you already feel better about this meeting than you did just minutes ago. As I drone on about the urgent topic of the meeting, your mind does the math. Counting you and me, there are nine of us. It doesn't take much to figure out that there is at least one cookie for everyone in the room. Your brain signals a salivary response and, depending on your current blood sugar levels, possibly a preemptive insulin release from the pancreas.

In other words, you begin to act like that cookie is yours. If I were to survey you at that moment about how appetizing the cookies seem and how much you expect to enjoy yours, you and the others might collectively estimate the cookies at 6.5 on a scale from 1 to 10. Good cookies, to be sure.

That's when a knock is heard at the door and someone enters — a confederate of mine, though you don't know this — who explains that they made a mistake in letting me take the cookies. In fact, the executives ran out of cookies, and they need eight of them back. I apologize, take two cookies from the plate and put them on a napkin for us to keep. My confederate leaves the room with the eight other cookies.

If I survey you now and ask you how appetizing the two cookies left on the table appear, what do you think happens to your estimate? If you guessed that your desire for the cookies goes up, you are in tune with human nature. Indeed, the average score for the cookies will be higher, coming in at more like 7.5, even though the cookies did not change.
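If you want to play with the numbers in this thought experiment, here is a minimal sketch of the survey math. The individual ratings are invented to match the 6.5 and 7.5 averages described above; they are not data from a real study.

```python
# Toy illustration of the scarcity effect in the cookie scenario above.
# The ratings are invented to match the averages in the text (6.5 vs. 7.5);
# they are not data from a real experiment.

def mean(ratings):
    return sum(ratings) / len(ratings)

# Nine attendees rate the same cookies on a 1-10 scale.
abundant = [6, 7, 6, 7, 7, 6, 7, 6, 6.5]  # 10 cookies on the plate
scarce = [7, 8, 7, 8, 7, 8, 7, 8, 7.5]    # same cookies, only 2 left

print(f"Abundant condition: {mean(abundant):.1f}")                  # 6.5
print(f"Scarce condition:   {mean(scarce):.1f}")                    # 7.5
print(f"Scarcity premium:   {mean(scarce) - mean(abundant):+.1f}")  # +1.0
```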

Read more

By 2025, 50% Of Adults Under Age 32 Will Not Pay For TV

Title got your attention? It should. In a report I just published this week, I use our Forrester Consumer Technographics® data to identify the 7% of adults who are digital cord-nevers — measured as people who have never paid for TV and who are under age 32. This is the worrisome group whose arrival TV-industry pros have nervously anticipated. As we show in the report, they are now officially larger than the entire adult population of cord cutters, who come in at 6% of all adults. Put them together, and you have 13% of adults who are not paying for TV while still getting all the TV value they need from a combination of Netflix, Amazon Prime Video, and other tools.

Don't jump out of any Times Square windows just yet. TV is still massively popular and will continue to be; Forrester clients can read the full report here. These defector groups are going to grow over time, true. And as the title of this post suggests, if we model this behavior out over the next 10 years, we expect that 50% of adults under age 32 will not pay for TV, at least not the way we think of it today. That compares with the 35% of that age group that doesn't pay for TV today. (That's right, more than a third of them are already out of the pay TV game.)
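The report's model isn't reproduced in this post, but the headline math is easy to sanity-check. Here is a minimal sketch under two growth shapes of my own choosing (straight-line and compound); neither is the actual Forrester model.

```python
# Sanity check of the projection above: 35% of adults under 32 don't pay
# for TV today, and the forecast is 50% in 10 years. The linear and
# compound growth paths below are my own illustrative assumptions, not
# the actual Forrester model.

start, end, years = 0.35, 0.50, 10

# Straight-line growth: equal percentage-point gains each year.
step = (end - start) / years  # 1.5 percentage points per year
linear = [start + step * t for t in range(years + 1)]

# Compound growth: a constant annual growth rate.
cagr = (end / start) ** (1 / years) - 1  # ~3.6% per year
compound = [start * (1 + cagr) ** t for t in range(years + 1)]

for t in (0, 5, 10):
    print(f"Year {t:2d}: linear {linear[t]:.1%}, compound {compound[t]:.1%}")
# Year  0: linear 35.0%, compound 35.0%
# Year  5: linear ~42.5%, compound ~41.8%
# Year 10: linear 50.0%, compound 50.0%
```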

Read more

Building Direct Digital Relationships In A Sea Of Rising Intermediaries

There is a fundamental division at the heart of the digital economy. Digital tools make it possible for any company to build a direct relationship with its customers. At the same time, new digital intermediaries can use those very same tools to carve out unprecedented intermediary roles between companies and their customers. Torn between two lovers, anyone?

We're in the age of the customer, a period during which end consumers have more access to the basic economic resources that help them make more rational and empowered decisions. The theory of perfect competition dictates that market economies flourish best on a foundation of perfect information that enables perfectly rational actors. The digital technologies we all carry in our pockets — not to mention the ones surrounding us in our cars, in our homes, and even strapped to our bodies — have initiated a chain reaction, unleashing an unprecedented level of information that enables us — if we choose to accept our mission — to behave like much more rational actors than ever before. (Caveat lector: I didn't say "perfectly rational" for a reason. See our research on how humans make choices to understand more.)

The more those technologies spread, the more buyers and sellers enter the system, the more innovation there is — at lower cost, thanks to the economics of digital disruption — and the spiral feeds itself.
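To make the link between information and rational behavior concrete, here is a toy search-cost simulation of my own devising (not a Forrester model): the more price quotes a buyer can compare, the closer the price paid gets to the competitive minimum.

```python
# Toy search-cost simulation (my own illustration, not a Forrester model):
# as buyers compare more price quotes before purchasing, the price they
# pay converges toward the market minimum, the perfect-information limit.
import random

random.seed(42)

# A market of 50 sellers posting prices between $10 and $20.
prices = [random.uniform(10, 20) for _ in range(50)]

def average_price_paid(quotes_compared, trials=10_000):
    """Each buyer sees a random subset of quotes and picks the cheapest."""
    total = 0.0
    for _ in range(trials):
        total += min(random.sample(prices, quotes_compared))
    return total / trials

for k in (1, 2, 5, 20):
    print(f"Quotes compared: {k:2d} -> average price paid: ${average_price_paid(k):.2f}")

# With one quote, buyers pay around $15 on average; with 20 quotes they
# pay close to the cheapest price in the market. More information, more
# rational outcomes, and more pressure on sellers to compete.
```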

Read more

Will People Really Do That? Hyperadoption Says Yes

Two weeks ago, I stood on Forrester's mainstage at its 2015 Forum For Marketing Leaders in New York (see a few minutes of the speech below). There, I told an audience of hundreds of our clients about hyperadoption, a term that I'm amazed no one has coined before now. Get used to the word. Because it's the characteristic that will define the next 10 years of your personal and business experience. In fact, in our first report debuting the concept of hyperadoption — released the same day I stood on the stage — I claim that hyperadoption will cause the next 10 years to generate an order of magnitude more change in your life than the past 10 years did.

That's an audacious claim. Because the past 10 years gave you the smartphone and the tablet. But I mean it, and over the coming year, I intend to prove it in my research. 

Forrester clients can read the report, which synthesizes much of the work I've done over two decades of watching how consumers adopt new things, from the first consumer experiences on what was then known as the World Wide Web to that once-rare behavior known as online shopping. That front-row seat, combined with the neuroscience research I've followed since my own days in the lab, has convinced me that the economics of digital disruption now allow people to bypass the ancient, loss-avoiding algorithms running in our heads, the ones that used to make us cautious of new things and no longer do.
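One way to formalize that argument (my sketch, borrowing the standard loss-aversion coefficient from prospect theory; it is not a model from the report): adoption happens when the expected benefit outweighs the loss-weighted cost of trying, and digital distribution pushes that cost toward zero.

```python
# Sketch of the loss-aversion argument (my framing, not the report's model).
# Prospect theory estimates that people weigh losses roughly 2.25x as
# heavily as equivalent gains (Tversky & Kahneman, 1992). A consumer tries
# something new only when the expected benefit clears that weighted hurdle.

LOSS_AVERSION = 2.25

def will_adopt(perceived_benefit: float, cost_of_trying: float) -> bool:
    """Adopt only if the benefit outweighs the loss-weighted cost."""
    return perceived_benefit > LOSS_AVERSION * cost_of_trying

# Boxed software bought up front: real money at risk, so the ancient
# algorithm says no even when the benefit exceeds the sticker price.
print(will_adopt(perceived_benefit=50, cost_of_trying=40))   # False

# A free, instantly delivered digital trial: the cost of trying collapses
# toward zero, so even a modest expected benefit clears the hurdle.
print(will_adopt(perceived_benefit=50, cost_of_trying=0.5))  # True
```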

As I write in the report: 

Read more

Change The Interface, Change The World: Microsoft HoloLens Takes Computing’s Next Giant Leap

Windows 10 comes with holographic computing built in. And to prove that it's serious about holography, Microsoft announced HoloLens, a headset that lets people interact with holograms in the real world.

Wait, what?

I know what you're thinking. Microsoft has a credibility problem when it comes to launching future tech. Remember that this is the company that tried to launch touch-based tablet computing in 2000. Microsoft also launched a smartwatch years before anybody else, and that, too, came to naught. I'll spare you a longer list of Microsoft's mislaunches. It all adds up to a fair bit of earned skepticism. Surely Microsoft can't be expected to create the computing interface that will do to the graphical user interface what the mouse did to the text-based user interface.

Read more

Sony Should Fight Fire With Fire And Post "The Interview" Online For Free

Late last night, Sony revealed that it would pull The Interview from its release schedule. The decision came in response to the major theater chains' collective agreement not to screen the movie on its release day. The unprecedented move is causing consternation among entertainment media types who feel that Sony has put the right of free speech in jeopardy. That's a conversation worth having, and I'm glad it's happening. But this situation brings an entirely new question into dramatic relief, one that didn't exist before and one that our premeditations won't help us resolve. The question is this:

Can companies participate in cyber war?

Up until now, companies have prepared to defend themselves against cyber attacks as one-off nuisances. Such attacks are now so common that they no longer make the news. Even massive breaches in which millions of customer records are compromised tend to give us pause for only a few moments, perhaps a few days, and then we move on. But what Sony experienced was not just a security breach. This hack was a declaration of cyber war intended to bring Sony to its digital knees: a low-cost, digitally effective cyber war that puts none of the hackers' assets in harm's way. And given yesterday's announcement, it appears to have worked.

Read more

Four Lessons On Wearables From LeWeb Paris 2014

I'm packing to leave Paris, and it's a hard town to leave. Not only because I managed to catch a glimpse of a Paris sunset last night from the top of Notre Dame, but because I'm leaving LeWeb Paris 2014 while it's still in full swing. There's no denying LeWeb is one of the most invigorating events I've attended. Highlights of the first two days included a candid discussion on Uber with celebrity venture capitalist Fred Wilson and amazing comments from Web founder Tim Berners-Lee on everything from robots to net neutrality to Europe's "right to be forgotten" laws. Most invigorating for me personally was the day-one session on wearables. LeWeb invited me to curate this hour-long track as part of its new format, which tackles multiple themes in short bursts over several days. Curation required pulling together experts on the topic, which was both simple and difficult: simple because there are some great ones to choose from, difficult because I would have had 10 people on stage if I could have managed it. But that's where the hard task of curation comes in.

Read more

Digging For Interactive Entertainment With Minecraft

Yesterday Microsoft announced it would acquire Mojang along with its massive Minecraft gaming franchise for $2.5 billion. By now we've all seen the coverage, including the gratuitous interviews with middle-schoolers about whether Microsoft is "cool" enough to own Minecraft. By and large, we think this is a good acquisition for Microsoft, and we said as much in our Quick Take, just published this afternoon, summarizing the acquisition, its benefits, and its challenges for Forrester clients. Go to the report to read the client-only details of our analysis: "Quick Take: Microsoft Mines Minecraft for the Future of Interactive Entertainment." As we explain in the report, there are specific challenges Microsoft will face that will determine whether this ends up being a sensible acquisition or a sensational one. 

Beyond the detailed analysis in the report, it's worth exploring the long-term question of what that sensational outcome would look like. The difference turns on whether Microsoft is ready to invest in the future of digital interactive entertainment. This is a subtle point that has been missed in most analyses of the acquisition. Most people insist on covering the purchase as a gaming-industry event: Microsoft, owner of the Xbox, buys Minecraft, a huge gaming franchise. But that low-level analysis misses a bigger picture, one that I sincerely hope Microsoft is actively aware of.

Read more

Has Apple Thrown In The Towel?

The rumors turn out to be true. Apple is buying Beats for $3 billion, just slightly less than originally suggested. Now that it's official, I'm confidently reiterating my conviction that Apple cannot be spending this unprecedented sum on Beats for either its headphones or its subscription music business. Because while Beats may be worth that much, it's not worth that much to Apple, the world's most innovative consumer electronics and consumer software maker. Choosing to buy Beats purely for its existing businesses and revenues would mean Apple significantly lowering its sights, going from innovative leader of life-changing technology to kinda-cool company that makes stuff teenagers like. Not that selling to teenagers isn't a good thing; it can certainly bring in money, but it doesn't typically generate long-lasting brand relationships.

To be clear, if Apple is buying Beats purely for its headphones or music subscription business, then Apple is making a mistake. However, there are those of us who still believe that Apple hasn't thrown in the towel. And why would it? There are still many consumer markets to dominate — entire markets like wearables and home automation tech and even in-car experiences, all of which are in their infancy — and Apple still has the smarts, the brand, and certainly the money to make a run for any of those things, if not all of them. So why would Apple instead sign up to become a holding company for fashionable but not life-changing brands?

Read more

Having Faith In Apple Plus Beats

A media frenzy arose last night when the Financial Times suggested it had word from the inside that Apple is closing in on buying Beats Electronics for $3.2 billion. The immediate response from all quarters has been puzzlement, on multiple levels. As the sun rose today, so did the doubts about the impending deal. Generally, large strategic acquisitions — like Google's purchase of Nest for a similar figure — can be justified on the basis of buying something you don't already have: a promising new technology, a large customer base, or entrée into a desirable industry. None of these applies to this acquisition. Acquisitions at a more mature business stage can typically be justified purely on a revenue or margin basis, or by the desire to snap up a brand with more energy. Those don't apply here, either. Even those who have tried to stretch the argument have suggested that Apple could be buying Beats purely as a quick road into music streaming, a hedge against Spotify — except that Apple owns the music industry and doesn't need Beats to build a streaming offering, one the company has denied for years it should even consider.

Read more