The future is here, folks, and the gaming industry is the first to get us there. Today I leave E3, the gaming industry's biggest US convention. When all is said and done, roughly 45,000 people will have come through LA's convention center -- most of them as nerdy as you're imagining right now -- to play the newest games, demo the latest hardware, and collectively drool over hyper-realistic zombies, aliens, robots, and other baddies game designers have placed in our digital sights.
At this E3 we have witnessed more advances in living room technology than the cable, consumer electronics, or computer industries (yes, that includes Apple) have managed to pull off in many years of trying. Let me summarize:
It's late, so this is just a short note to let you know that today I saw the future -- and what I saw was so stunning I couldn't go to sleep without telling you about it first. The future is the new Xbox 360 that debuted at Microsoft's E3 press conference today -- not just the improved hardware, which ships to stores today and costs the same as the previous model, but Xbox Kinect. This is Microsoft's long-awaited full-body natural user interface (NUI) for the Xbox 360, previously codenamed Project Natal and now branded as Kinect.
Kinect is everything. Kinect is the future of everything. Kinect is to the next decade what the operating system was to the 1980s, what the mouse was to the 1990s, and what the Internet has been ever since. It is the thing that will change everything. Once we've all been Kinected, we will never go back. You'll shop, communicate, chill out, engage, and debate using technology that can see you, image you in three dimensions, and interact with you in ways that are cooler than the most far-out science fiction, yet completely natural.
I could explain it, I could try, but I won't. Instead, I'll just encourage you to watch the last hour or so of the press conference yourself (though if you follow my link, you may also want to watch the first few minutes just to catch the pre-game interview with Felicia Day -- isn't she adorably nerdy?).
It has only been a few weeks since Google announced it would create a brave new world with its Google TV platform. Amid all the reactions and commentary, I have been amazed at how little people understand what's really going on here. Let me summarize: Google TV is a bigger deal than you think. In fact, it is so big that I scrapped the blog post I had drafted about it, because only a full-length report (with supporting survey data) could adequately explain what Google TV has done and will do to the TV market. That report went live this week. Allow me to explain why it was necessary.
Some have expressed surprise that Google would even care about TV in the first place. After all, Google takes nearly $7 billion into its coffers each quarter from that little old search engine it sports, a run-rate of roughly $27 billion a year. In fact, this has long been a problem for Google -- its core business is so terribly profitable that it's hard to justify investing in acquisitions and side projects that have zero hope of ever contributing meaningfully to the business (not unlike the problem at Microsoft, where Windows 7 effectively is Microsoft). So why would Google bother with the old TV in our living rooms?
Because TV matters in a way that nothing else does. Each year, the TV drives roughly $70 billion in advertising, an equal amount in cable and satellite fees, and another $25 billion in consumer electronics sales. Plus, viewers spend 4.5 hours a day with it -- which is, mind you, the equivalent of a full-time job in some socialist-leaning countries (I'll refrain from naming names).
Google's goal is to get into that marketplace, eventually appropriating a healthy chunk of the billions in advertising that flow to and through the TV today with such painful inefficiency.
It's the most common question I get in my travels: Will people ever pay for content again? See what I had to say about that in a recent interview below (as posted on Paidcontent.org).
Implied in the question is a belief in some yesteryear in which people did pay for content. But the good news is, they never have and never will. That's the good news? Yes, because once we stop imagining that people will someday pay for content again, we can focus on giving them what they will pay for -- access to content.
It's what people have always paid for and it's clearly what they pay for now. Look deeper into the past and you find that people did not underwrite all the pages of content in their daily newspapers. Yes, they paid for the newspaper, but that's just because the newspaper was the only way to get efficient access to that much news and information. Today, instead of paying for newspapers, they pay for high-speed data plans to their homes and on their mobile devices as well as subscriptions to content from Netflix and their cable companies, accounting for 77% of their monthly spend on content. And they will pay even more for that in the future as 4G becomes a reality.
But there's another big story behind this Flash fiasco that has successfully remained off the radar of most. It's the answer to this question: How do the media companies -- you know, those people who use Flash to put their premium content online everywhere from Wired.com to hulu.com -- feel about having their primary delivery tool cut off at the knees?
Answer: Media companies hope to complain all the way to the bank.
First, a bit of disclosure. I'm the one who went on record explaining that the lack of Flash is one of the reasons I am not buying an iPad. So I'm clearly not a fan of the anti-Flash rhetoric for selfish reasons: I want my Flash content wherever I am. But I've spent the last few weeks discussing the Apple-Adobe problem with major magazine publishers, newspaper publishers, and TV networks. Their responses are at first obvious, and then surprisingly shrewd.
The Hulu-will-charge-you-money rumor mill is churning once again and the blogosphere has lit up with preemptively angered Hulu viewers vowing that they will never darken Hulu’s digital door again. Some call it greed, others point to nefarious pressure from ailing broadcast and cable operations, while some decry the end of a freewheeling era. They are all wrong.
Hulu charging for content is a good thing. In fact, it’s a necessary next step to get us where we need to be. Let me explain.
This comes at an awkward time, to say the least. The site’s CEO, Jason Kilar, admitted just weeks ago that the free site is profitable, taking in more than $100 million last year and on a run-rate to more than double that this year. Blunting that momentum would be foolish. But letting it run without the burden of helping to pay for the shows it profits from would also be irresponsible -- not irresponsible in a Father-knows-best, “charging for content builds character” way, but in a “failing to take the opportunity to bring Hulu to the next level for the benefit of the consumer” way.
Throughout the iPad coverage, as various members of the press have mused about the death of Amazon's Kindle, I have felt compelled to point out that, contrary to popular belief, Amazon is in a better position now than it was before the iPad. That's right: if Amazon comes out swinging, Round 2 will go to Amazon. Here’s why:
Just came off the stage at PaidContent 2010, a day-long summit here at The Times Center near Times Square, dedicated to the question of if/how/when people will pay for content. The timing is good -- as I wrote in January, The New York Times is planning to charge for content within a year or so, Hulu is considering a subscription model (not necessarily in place of but, I believe, in addition to its free service), and the eBook pricing dilemmas are causing sleepless nights.
I opened the conference with a brief assertion that fretting over whether people will pay for content is based on a mistaken assumption: that people have ever paid for content in the past. They actually haven't. Instead, people have paid for access to content. But in an analog world, access was gated by physical form factors like vinyl, newsprint, and movie theaters. As a result, the coincidence of form factor and content made us believe that people pay for content.
But people have never paid for content. Even when a daily newspaper was a necessity for the average home, the dime a day you paid for a paper (in the '70s) did not cover the cost of printing it, much less the reporting. Instead, classified ads and auto dealers footed most of the bill. And the hours of TV and radio we consumed every day through the second half of the last century, right up until cable exploded in the '90s, were free. When cable finally asserted itself, people did not pay per show or even per channel (with the exception of premium movie channels). Instead, they paid for overall access.
It was a surprising weekend for those of us who had naively imagined that after crossing the River iPad, we might actually get some Elysian rest. But, alas, the fates conspired against us and handed us the curious case of Amazon vs. Macmillan. Or Macmillan vs. Amazon?
For those who actually took the weekend off, let me summarize what happened. John Sargent, the CEO of Macmillan Books, gave Amazon a wee bit of an ultimatum: switch from a wholesale sell-through model, in which Amazon buys digital books at a fixed wholesale rate and can then sell those books at whatever price it deems appropriate (even at a loss, as it does with $9.99 bestsellers), to an agency model, in which Amazon agrees to sell at a price set by the publisher in exchange for a 30% agency fee. Sargent explained that if Amazon did not agree to the switch, Macmillan Books would subject its eBooks to significant "windowing," in which new books are held back from the digital store for some period, say six months, while hardbacks are sold in stores and, possibly, digital copies are sold through the iPad at $14.99.
We know more detail than usual about a negotiation like this because of what happened next. Sargent got off a plane on Friday only to discover that Amazon had responded by pulling all Macmillan books from the Kindle store as well as from Amazon.com. He then decided to make it clear to the industry (and to his authors) that this drastic action was Amazon's fault, in a paid advertisement in a special Sunday edition of Publishers Lunch.
I have a weakness. I like to think big. And when we heard so many juicy rumors about the Apple tablet device, now named the iPad, I knew that with Steve Jobs at the helm, I could afford to think big. So big did I think, that I suggested the iPad should take media consumption to the next level and create an entirely new category of device.
At first, Jobs appeared ready to confirm my suspicions. He said seductive things like, "Everybody uses a laptop and/or a smartphone. The question has arisen lately: is there room for a third category in the middle?" I was sitting on the edge of my seat, ready to watch Jobs demonstrate that new category of device. But he didn't.
Instead, what Apple debuted today was a very nice upgrade to the iPod Touch.
Don't get me wrong. I love the iPod Touch, and I was this close to getting one for myself. Now that the iPad has arrived, I can finally get one -- the new, big one. But it's not a new category of device. It doesn't really revolutionize the 5-6 hours of media we consume each day the way it could have. It doesn't even send Amazon's Kindle running for the hills. In fact, the competitor likely to take the biggest hit from the arrival of the iPad is Apple itself, in the form of fewer iPod Touches and fewer MacBook Airs sold.