Virtual reality and 360 video continue to gain momentum, but one obstacle that could stop them in their tracks is the lack of an analytics standard.
Virtual reality and 360 video (treated as synonymous for this post) deliver immersive experiences. If you have never consumed VR video, imagine standing inside a globe with content flowing all around you.
Video analytics can be robust, but 360 video introduces new challenges. Instead of a “lean back” experience, viewers of VR video take an active role in deciding where to focus. This means that success can’t be defined by views alone. Application Development & Delivery professionals will either need to develop their own analytics scheme or partner with a third-party firm.
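What might such a scheme look like? Here’s a purely illustrative Python sketch (the sampling approach, grid size, and function names are mine, not any vendor’s API): log the viewer’s heading during playback and aggregate the samples into a coarse attention heat map.

```python
from collections import Counter

def bucket(yaw_deg, pitch_deg, yaw_bins=12, pitch_bins=6):
    """Quantize a heading sample into a coarse grid cell so that
    samples can be aggregated into an attention heat map."""
    yaw_cell = int((yaw_deg % 360) / (360 / yaw_bins))
    pitch_cell = int((pitch_deg + 90) / (180 / pitch_bins))
    return yaw_cell, min(pitch_cell, pitch_bins - 1)

# Heading samples (yaw, pitch) captured once per second during playback.
samples = [(0, 0), (5, -2), (90, 10), (92, 12), (91, 8)]
heatmap = Counter(bucket(y, p) for y, p in samples)

# The hottest cell tells you where viewers actually looked,
# which a raw view count never could.
print(heatmap.most_common(1))
```

Where viewers spend their gaze, how long they stay, and whether they ever turn toward the content you care about are all metrics a view count can’t capture.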
To understand 360 analytics, we first need to understand the format. 360 video is captured by multiple cameras and stitched into a single flattened frame at a common resolution like 1920 by 1080 pixels. The flattened (or equirectangular) video lets you see everything at once. To create an immersive VR experience, that flattened video is then wrapped around a sphere using special metadata, so viewers can focus on only a sliver of the video at any one time.
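To make the projection concrete, here’s a minimal Python sketch (my own illustration, using the 1920 by 1080 example above) of how a viewer’s heading maps back to a pixel in the flattened frame.

```python
def heading_to_pixel(yaw_deg, pitch_deg, width=1920, height=1080):
    """Map a viewer's heading to the pixel it centers on in an
    equirectangular frame. Yaw in [-180, 180], pitch in [-90, 90]."""
    x = (yaw_deg / 360.0 + 0.5) * width
    y = (0.5 - pitch_deg / 180.0) * height
    return int(x) % width, min(max(int(y), 0), height - 1)

# Looking straight ahead lands in the center of the flattened frame.
print(heading_to_pixel(0, 0))    # (960, 540)
# Looking 90 degrees to the right and slightly up.
print(heading_to_pixel(90, 30))  # (1440, 360)
```

This is exactly why heading data matters for analytics: every gaze direction corresponds to a specific region of the source video, so you can tell which parts of the content were actually seen.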
If your team is looking to deploy VR video, Application Development & Delivery professionals need to figure out how to define success. A few options might include:
Do you consider yourself “data-driven”? If you’re like most business and technology leaders, you do. But the reality is that most businesses have only scratched the surface when it comes to transforming all of that data into insight that drives real business action. In our 2016 predictions report, my colleagues Brian Hopkins, Jennifer Belissent, PhD, and I predict what will happen in the hottest areas of big data, analytics, business intelligence, and systems of insight, and tell you what to do about it. Here’s a sneak peek at just a few highlights:
Chief data officers (CDOs) will gain power, prestige, and presence . . . for now. The trend toward appointing a CDO accelerated in 2015 and will continue in 2016. Customer insights (CI) pros should take advantage of this. How? Extend customer insights beyond marketing to drive a culture of insights-to-execution across the organization.
Firms will try to come to terms with data science scarcity. Two-thirds of firms will have built predictive systems capability by mid-2016, but will struggle to find data science talent. Customer insights teams must increase analytic yield without waiting for hard-to-find data scientists. How? Some analytics platforms from vendors like AgilOne, Custora, and Origami Logic can empower business users without a rigorous statistical background.
Last year I published a reasonably well-received research document on Hadoop infrastructure, “Building the Foundations for Customer Insight: Hadoop Infrastructure Architecture”. Less than a year later, it’s looking obsolete, not so much because it was wrong for traditional Hadoop (and yes, it seems funny to call “traditional” a technology that is itself still rapidly evolving and has been in mainstream use for only a handful of years), but because the universe of analytics technology and tools has been evolving at light speed.
If your analytics are anchored by Hadoop and its underlying MapReduce processing, then the mainstream architecture described in the document, clusters of servers each with their own compute and storage, may still be appropriate. On the other hand, if, like many enterprises, you are adding analysis tools such as NoSQL databases, SQL on Hadoop (Impala, Stinger, Vertica), and particularly Spark, an in-memory analytics technology well suited to real-time and streaming data, you may need to reassess the supporting infrastructure to build something that continues to support Hadoop while catering to the differing access patterns of these other tool sets. This need to rethink the underlying analytics plumbing was brought home by a recent demonstration from HP of a reference architecture for analytics, publicly referred to as the HP Big Data Reference Architecture.
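To see why Spark stresses infrastructure differently, consider this minimal PySpark sketch (the HDFS path and record layout are hypothetical). The working set is pinned in cluster memory and reused across passes, where MapReduce-style jobs would re-read from disk on each pass; that shifts pressure from local disks toward memory and the network.

```python
from pyspark import SparkContext

sc = SparkContext(appName="cache-demo")

# Load once from distributed storage (path and format are illustrative).
events = sc.textFile("hdfs:///data/events").map(lambda line: line.split(","))

# Pin the working set in cluster memory: later actions reuse it instead
# of re-reading from disk, which is what MapReduce would do each pass.
events.cache()

# Two passes over the same data; only the first touches storage.
total = events.count()
errors = events.filter(lambda fields: fields[0] == "ERROR").count()
print(total, errors)
sc.stop()
```

Iterative and streaming workloads like this reward memory-heavy nodes and fast interconnects, a different balance than the disk-dense commodity servers a MapReduce cluster favors.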
At Cisco Live 2014 in San Francisco last week, we heard about plenty of updates, extensions, and new acquisitions to expand the business. The major technologies highlighted were InterCloud, Application Centric Infrastructure (ACI), and the Internet of Everything (IoE). Among these new offerings, Cisco’s extended big data and analytics capabilities excited me the most. Why? Because its data virtualization techniques can help customers easily analyze large volumes of data no matter where it physically resides; its enhanced video analytics technology could improve the customer experience when checking out in retail stores or waiting for a train; and its IoE analytics and digital intelligence can increase customer engagement.
Data virtualization supports big data analytics. End user organizations realize the importance of making decisions quickly and carefully; to do this, they plan to centralize data from different branch offices or departments. But consolidating data that resides in multiple systems and global locations, or that is locked away in spreadsheets, is expensive. For example, telecom operators in China have hundreds of millions of subscribers and need to consolidate and analyze customer data that resides in 31 provincial companies. Consolidating that data would be a huge and expensive project, but data virtualization technology can help solve the problem. Customers could consider adding Cisco to their data virtualization vendor shortlist, especially given Cisco’s acquisition of Composite Software last July.
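To make the idea concrete, here’s a toy sketch of one logical query spanning data that stays where it lives. Local SQLite files stand in for the provincial systems; a real data virtualization layer (such as the Composite Software technology Cisco acquired) federates remote, heterogeneous sources, but the principle is the same.

```python
import sqlite3

# Stand-ins for two provincial subscriber databases. Real data
# virtualization spans remote, heterogeneous systems; local SQLite
# files are used here only to mimic the idea.
for name, plan in (("beijing", "prepaid"), ("shanghai", "postpaid")):
    db = sqlite3.connect(f"{name}.db")
    db.execute("CREATE TABLE IF NOT EXISTS subscribers (id INTEGER, plan TEXT)")
    db.execute("INSERT INTO subscribers VALUES (1, ?)", (plan,))
    db.commit()
    db.close()

conn = sqlite3.connect(":memory:")
conn.execute("ATTACH 'beijing.db' AS beijing")
conn.execute("ATTACH 'shanghai.db' AS shanghai")

# One logical query over data that stays where it lives: no
# consolidation project required.
rows = conn.execute("""
    SELECT 'beijing' AS region, plan, COUNT(*)
      FROM beijing.subscribers GROUP BY plan
    UNION ALL
    SELECT 'shanghai', plan, COUNT(*)
      FROM shanghai.subscribers GROUP BY plan
""").fetchall()
print(rows)
```

The payoff is that analysts get a single logical view without the cost, delay, and risk of physically moving the data first.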