What are we going to call cars that drive themselves? The term "automobile" would be perfect, but that's already taken.
"Nemo" means nobody in Latin -- a car driven by no one would be called a "nemobile." And you could call them "nemos" for short.
By the way, that's nemo with a long "e" -- pronounced like the fictitious fish or the commander of Jules Verne's submarine the Nautilus.
Short, distinctive, meaningful, and a good nickname. Much better than "driverless car," or "self-driving car," or the inevitable flat acronym "SDC."
By the way, if you are doubting that nemos will be on the scene anytime soon, consider this, as noted by Erik Brynjolfsson and Andrew McAfee in their excellent book Race Against the Machine: In 2004, DARPA offered a prize for any autonomous vehicle that could navigate a 150-mile course in the Mojave Desert. The best-performing nemo traveled only eight miles, and it took two hours to do that. But by 2010, Google's self-driving cars had logged 1,000 miles on US highways.
Now fast forward to the summer of 2015 -- Google reported that its cars had logged one million miles of autonomous driving. And in 2016, the Tesla Model S and Model X now offer easy-to-use, dependable Autopilot -- a very credible early form of self-driving.
Since Mobile World Congress, where the reality on the show floor was often either virtual or augmented, I’ve been thinking quite a bit about the practical uses of AR and VR – particularly in government and a smart city context. It’s not just all fun and games, is it?
The example of changing a roller coaster experience with new settings delivered via VR glasses is really cool. Yes, you can imagine repeating the ride to experience catapulting through medieval battle, flying through a tropical jungle, or bobsledding down alpine slopes. But the practical side of us – or at least me – wants to know what else there is. And, fortunately, I have a colleague who has already been thinking of these things.
A few months ago, I had the pleasure of collaborating with JP Gownder on a presentation for Forrester clients in Geneva. I presented on ways to derive value from data and opportunities to leverage new insights service providers – clearly something top of mind for many of our clients. But alas, JP’s presentation was much cooler, providing examples of how to derive real value from new technologies, including AR and VR. Since then, I’ve been thinking about how the two are related. And, in fact, they are.
Smart watches are not a must-have device – yet. The novelty of the device – combined with early adopters eager to have the next great thing – has carried smart watches from an obscure idea to a well-known device, but not to critical mass or mass market adoption. So what’s missing?
Smart watches or similar wearables will hit critical mass (20%) and then mass market adoption (> 50%) only once consumers adopt these five applications:
1. Notifications. Among consumers surveyed by Forrester, 40% are tired of pulling their phones out of their pockets or purses. Moreover, according to a study by Mary Meeker of Kleiner Perkins, 60% to 70% of consumers’ mobile moments are simply a quick glance at their devices to get the information they need to make a decision or take action. Notifications could range from a sports score to a reminder to pay a bill. Smartphones and apps are overkill for these interactions or mobile moments.
2. Payments. Mobile payment solutions from companies like Apple, Google, and Samsung, among others, are game-changing. The combination of near-field communication (NFC) and payments drove adoption of the current generation of smartphone upgrades. Mobile payments remove friction from the payment process both online and in-person. For example, I use my Apple Wallet so often that it took me six weeks to realize that my ATM card had expired.
CIO pushback is a typical growing pain for all business intelligence (BI) startups. It means your land-and-expand strategy is working. Once you start expanding beyond a single department, CIOs will notice. As a general rule, the earlier the CIO is brought on board, the better. CIOs who feel left out are likely to raise more objections than those who are involved in the early stages. A number of BI vendors that started out with a strategy of purposely avoiding the CIO found over time that they had to change their strategies - ultimately, there’s no way round the CIO. Forrester has also noticed that the more a vendor gets a reputation for “going round” the CIO, the greater the resistance from CIOs once they do get involved.
There is of course also the situation where the business side doesn’t want the CIO involved, sometimes for very good reason. That notwithstanding, if there’s a dependency on the CIO when it comes to sign-off, Forrester would strongly recommend encouraging the business to bring him/her to the table.
The two key aspects to bear in mind in this context are:
CIOs look for transparency. Have architecture diagrams to hand out, be prepared to explain your solution in as much technical detail as required, and have answers ready regarding the enterprise IT capabilities listed below.
I’ve been a part of several development organizations, and, for several of those teams, security was an afterthought to the development process. We’d secure databases and even implement field-level encryption, but we rarely had to consider many attack vectors: we were building internal apps for enterprises, so the risks were there, but not as great.
Fast forward to the Mobile First world we live in, and that lazy attitude is no longer acceptable. S&R teams have real concerns and actively work to protect their computing environments – both internal-facing and external-facing. Development teams work the other side of that and implement secure code as part of their daily activities (right?). With an appropriate level of trust between the two organizations, many use code scanning utilities to verify delivered code and hunt for vulnerabilities. Vulnerabilities have many sources: code written by the company’s developers, code pasted in from Stack Overflow, or code added through a third-party or open source library. In my experience, static code scanning tools are effective and can catch a lot of potential vulnerabilities, but, from a developer behavior standpoint, what they ultimately do is simply teach developers how to get their code to pass the scans, not actually deliver more secure code.
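To make the point concrete, here is a minimal sketch of the kind of flaw static scanners routinely flag: SQL built by string interpolation versus a parameterized query. The table and function names are illustrative, not from any particular product or scanner:

```python
import sqlite3

def find_user_unsafe(conn, username):
    # Static scanners flag this pattern: the SQL string is built by
    # interpolation, so a crafted username can change the query itself.
    query = "SELECT id FROM users WHERE name = '%s'" % username
    return conn.execute(query).fetchall()

def find_user_safe(conn, username):
    # Parameterized query: the driver binds the value as data, not SQL.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (username,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

# Both versions look fine with benign input...
print(find_user_unsafe(conn, "alice"))          # [(1,)]
# ...but attacker-controlled input rewrites the unsafe query's meaning:
print(find_user_unsafe(conn, "x' OR '1'='1"))   # [(1,)] - returns every row
print(find_user_safe(conn, "x' OR '1'='1"))     # [] - treated as a literal
```

A scanner will push a developer to switch to the parameterized form here, which is the right fix; the behavioral risk I describe above is developers learning only the patterns the scanner recognizes.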
Years ago, I worked at a large customer service vendor. Our CEO had tasked us to "eat our own dog food" - that is, implement our own solutions for our customer service operations, which comprised 40 or so tier 1 and tier 2 customer service agents. With these marching orders, I put a group of consultants and business analysts together to get this done. And after several months, the project stalled, got restarted, stalled again, then finally died. We limped on with our old systems in place for many more years.
Why did this project fail? It was because of a mismatch between the complexity of the solution we were trying to implement and the company's business needs. The customer service company I worked for made enterprise software solutions suitable for large organizations, typically implemented in call centers with many hundreds, if not thousands, of agents. These solutions offered robust case management, with highly customizable workflows and queuing and routing rules. They also offered complex knowledge management and email and chat engines that could support millions of interactions a month. Implementations tended to span many months, with professional services consultants diving into the business processes that agents followed and then reproducing them in these enterprise solutions.
Yet these solutions - as powerful as they are - were too complex for our simple needs. There were no simple "out of the box" best practice process flows. There were no rapid deployment options to get a company up and running quickly. There were no simple ways of setting up FAQs or simple knowledge, or of creating simple email and chat routing rules for a moderate volume of digital interactions. What we needed was a highly usable solution, with a quick time-to-value, that contained just the most common functions of the enterprise solution.
Digital intelligence (DI) is the practice of bringing together the big data that we have on our customers to analyze and generate insights, so as to deliver the most relevant experiences during the moments of their digital interactions. Firms that get it right have a major competitive advantage in the digital age of the customer. (For more information on the digital intelligence approach, see the “Optimize Customer Experiences With Digital Intelligence” Forrester report.)
This hot topic is why I am excited to announce the publication of the brand-new Forrester report “TechRadar™: Digital Intelligence, Q2 2016”. In this report, I analyze and review the business success and growth of the 15 core technologies for digital data management, analytics, and experience optimization needed to deliver great digital intelligence capabilities.
Some of my findings include:
DI tech is really hot at the moment. Whether it’s technology to ingest, manage, and merge different customer data (e.g., tag management or data warehousing), to generate digital insights (e.g., app analytics or spatial analytics), or to optimize digital interactions (e.g., online testing or behavioral targeting), we found that all the core DI technologies are on a trajectory to deliver moderate, if not significant, success.
Knowledge is power. And in a time when insights drive business differentiation, knowledge is also the origin of power. In our daily routines as consumers, search is probably the most common application we use to find knowledge, and it forms the basis of our personal systems of insight. But at long last, search in the enterprise is catching up. A new wave of search-based applications and search-driven experiences is now being delivered by companies that understand the need to empower their employees and customers with immediate, contextual knowledge in an easily consumable format.
In our new research, Mike Gualtieri and I look at how the emerging landscape of cognitive search experiences is incorporating advanced analytics, natural language processing (NLP), and machine learning to enable organizations to see across wide arrays of enterprise data and stitch together the insights hidden among them.
Today in the US, we are gearing up to celebrate Cinco de Mayo with lively music, ice-cold margaritas, colorful clothing — the works. But while many Americans use the day to revel in the trappings of Mexican culture, they often don’t realize that the holiday is actually met with little pomp and circumstance in Mexico itself.
Cinco de Mayo is one of many traditions that have been adopted — and appropriated — across country borders. But the holiday represents a larger concept that applies to people, too: As individuals relocate around the world, they spark cultural variations and build unique identities in their own right.
For example, Forrester’s Consumer Technographics® survey data shows that Mexican-born individuals who now live in the US develop distinct behaviors and attitudes: Not only do these longer-tenured US residents become more comfortable sharing sensitive data (like financial information) online, they also increasingly execute digital transactions.
It’s interesting to note that even though metropolitan Mexico and the US have similar mobile penetration rates, the device profile, technology attitudes, and digital behaviors that characterize Mexican consumers shift after they settle in the US.
Delivering broad access to data and analytics to a diverse base of users is an intimidating task, yet it is an essential foundation of becoming an insights-driven organization. To win and keep customers in an increasingly competitive world, firms need to take advantage of the huge swaths of data available and put them into the hands of more users. To do this, business intelligence (BI) pros must evolve disjointed and convoluted data and analytics practices into well-orchestrated systems of insight that deliver actionable information. But implementing digital insights is just the first step with these systems — and few hit the bull's eye the first time. Continuously learning from previous insights and their results makes future efforts more efficient and effective. This is a key capability of next-generation BI, what Forrester calls systems of insight.
"It's 10 o'clock! Do you know if your insights are supported by actual, verifiable facts?" This is a real challenge, as measuring report and dashboard effectiveness today involves mostly discipline and processes, not technology. For example, if a data mining analysis predicted a certain number of fraudulent transactions, do you have the discipline and processes to go back and verify whether the prediction came true? Or if a metrics dashboard was flashing red, telling you that inventory levels were too low for the current business environment, and the signal caused you to order more widgets, do you verify whether this was a good or a bad decision? Did you make or lose money on the extra inventory you ordered? Organizations are still struggling with this ultimate measure of BI effectiveness. Only 8% of Forrester clients report robust capabilities for such continuous improvement, and 39% report just a few basic capabilities.
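Even though today this closed loop is mostly discipline and process, the mechanics are simple enough to sketch. Here is a minimal, illustrative decision log that records each insight-driven action alongside its predicted impact, lets you fill in the actual outcome later, and scores how often predictions held up. All names and fields here are hypothetical, not from any BI product:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Decision:
    made_on: date
    insight: str              # e.g. "dashboard flagged low inventory"
    action: str               # e.g. "ordered 500 extra widgets"
    predicted_value: float    # expected financial impact of the action
    actual_value: Optional[float] = None  # filled in once the outcome is known

decisions: list = []

def record(insight: str, action: str, predicted_value: float) -> Decision:
    """Log a decision at the moment the insight triggers an action."""
    d = Decision(date.today(), insight, action, predicted_value)
    decisions.append(d)
    return d

def close_loop(decision: Decision, actual_value: float) -> None:
    """Come back later and record what actually happened."""
    decision.actual_value = actual_value

def hit_rate(tolerance: float = 0.2):
    """Share of closed decisions whose outcome landed within `tolerance`
    of the prediction -- one simple measure of BI effectiveness."""
    closed = [d for d in decisions if d.actual_value is not None]
    if not closed:
        return None
    hits = [d for d in closed
            if abs(d.actual_value - d.predicted_value)
            <= tolerance * abs(d.predicted_value)]
    return len(hits) / len(closed)

# The widget example from the text: predict, act, then verify.
d = record("inventory dashboard flashed red", "ordered extra widgets", 10_000.0)
close_loop(d, 9_500.0)   # the order turned out to be worth $9,500
print(hit_rate())        # 1.0 - the one closed decision was within tolerance
```

The hard part, as the survey numbers above suggest, is not the bookkeeping but the organizational discipline to actually close the loop on every decision.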