Contributed by Bryan Wang, Di Jin, and Vanessa Zeng
JD.com, the second-largest online retailer in China, went public on May 22, listing on Nasdaq after just 11 years of existence. At the time of its IPO, JD had a market value of nearly $30 billion. Despite its size, however, JD still managed to grow its customer base by 62% in 2013. How does JD continue to achieve this kind of growth? I believe it comes down to three key factors that differentiate JD:
■ Comprehensive logistics network for online retail in China. JD.com invested heavily in a last-mile strategy to ensure that customers receive products as quickly as possible, establishing 82 local warehouses with 1,620 delivery and 214 pickup stations across nearly 500 cities in China. This has made same-day delivery available in 43 cities — far ahead of the capabilities of Google Shopping Express in San Francisco. To better reach customers in lower-tier cities, JD is also collaborating with local convenience store chains in provinces like Shanxi and Guangdong to further strengthen its last-mile delivery capability.
Forrester data shows that valuing a customer's time is the most important factor in good customer service. Customers simply want an accurate, relevant, and complete answer to their question upon first contact, so they can get back to what they were doing before the issue arose. Here are the numbers:
Consumers have little tolerance for long or difficult service interactions. 55% of US online adults are likely to abandon their online purchase if they can't find a quick answer to their question. In addition, 77% say that valuing their time is the most important thing a company can do to provide them with good online customer service, up 6 points from 2012.
Older customers are as intolerant of friction in service interactions as the young. Impatience is not just a trait of the young: Older Boomers have little patience for long customer service interactions, and meeting their high expectations can pay off. US online seniors may be less likely than their younger counterparts to purchase online, but don't underestimate their online commerce activity: 71% of US online consumers ages 69 and older have made an online retail purchase in the past three months.
Recent news of a computer program passing the Turing test marks a great achievement for artificial intelligence (AI). Pulling down the barrier between human and machine has been a decades-long holy grail pursuit. Right now, it is a novelty. In the near future, the implications are immense.
Which brings us to why you should care.
Earlier this week, the House majority leader, Eric Cantor, suffered an enormous defeat in Virginia's Republican primary at the hands of Tea Party candidate David Brat. No one predicted this; the polls were wrong by a long shot. In a New York Times op-ed, Frank Luntz, a Republican pollster and communication advisor, offered his opinion on what was missing: face-to-face discussions and interviews with voters. He asserts that data collection was limited to discrete survey questions and therefore lacked context: information such as voter mood, perceptions, motives, and overall mindset. Even if you collected quantitative data across a variety of sources, you wouldn't get to these prescient indicators.
The new wave of AI (over the next two to five years) makes capturing this insight possible, and at scale. Marketing organizations are already using such capabilities to test advertising messages and positioning in focus group settings. Take this a step further and allow pollsters to ingest full discussions, in person or through transcripts, from research interviews, street polls, social media, news discussions and interviews, and other sources where citizens' points of view bear directly or indirectly on voting, and that rich content translates into more accurate and insightful information.
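As a crude illustration of the aggregation pattern, here is a toy Python sketch; the hand-built mood lexicon is a hypothetical stand-in for the trained language models a real polling system would use:

```python
from collections import Counter

# Hypothetical toy lexicon; a real system would use trained language
# models, not a hand-built word list like this one.
MOOD_WORDS = {
    "angry": -2, "frustrated": -2, "worried": -1,
    "hopeful": 1, "excited": 2, "confident": 2,
}

def score_transcript(text: str) -> int:
    """Sum mood scores for every lexicon word found in a transcript."""
    return sum(MOOD_WORDS.get(w, 0) for w in text.lower().split())

# Stand-ins for ingested interviews, street polls, social posts, etc.
transcripts = [
    "I'm frustrated and angry about spending in Washington",
    "I'm hopeful the economy improves and confident in my district",
]

scores = [score_transcript(t) for t in transcripts]
print(f"mean mood score: {sum(scores) / len(scores):+.2f}")
print(Counter("negative" if s < 0 else "positive" for s in scores))
```

The point is not the word list; it is that free-form citizen speech becomes a quantifiable signal of mood and motive that discrete survey questions never capture.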
On June 10, Salesforce.com announced Salesforce Wear, a bundle of free tools and reference applications aimed at evangelizing the power of enterprise wearables. The offering supports six different wearable devices, each with its own open-source reference application to help developers design and build wearable apps that connect to the Salesforce1 platform.
Salesforce Wear has the potential to turbo-charge the growing market for enterprise wearables. Enterprises using Salesforce Wear will gain tools and reference applications that immediately apply to six wearable devices: three smart watches (Pebble, Samsung Gear, and Android Wear), plus Google Glass, the Myo armband, and Bionym’s Nymi authentication device.
Some of the reference applications are pure enterprise/B2B workforce enablement applications, like the Google Glass application for oil rigs, which can be generalized to other field service scenarios (and which, conceptually, I have written about before). Salesforce Wear’s app facilitates real-time field actions by providing schematics of the equipment being serviced, offering a view into the full service history of the equipment, and connecting field workers to colleagues for real-time collaboration. All in all, the reference app helps field workers fix problems more quickly and effectively.
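Under the hood, these reference apps are thin clients against the Salesforce1 platform's REST API. As a rough sketch of the pattern (not Salesforce's actual reference code), here is how a wearable companion service might pull a piece of equipment's service history in Python; the instance URL, access token, and EquipmentServiceHistory__c custom object are hypothetical stand-ins:

```python
import requests

# Hypothetical org URL and token; the token would come from a prior
# OAuth login flow, and the custom object is illustrative only.
INSTANCE_URL = "https://na1.salesforce.com"
ACCESS_TOKEN = "<oauth-access-token>"
SOQL = (
    "SELECT ServiceDate__c, Technician__c, Notes__c "
    "FROM EquipmentServiceHistory__c "
    "WHERE Equipment__c = 'RIG-042' ORDER BY ServiceDate__c DESC"
)

def fetch_service_history():
    """Run a SOQL query against the Salesforce REST query endpoint."""
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/v31.0/query",
        params={"q": SOQL},
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    )
    resp.raise_for_status()
    return resp.json()["records"]

# A watch or Glass app would render these records on its tiny display.
for record in fetch_service_history():
    print(record["ServiceDate__c"], record["Notes__c"])
```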
Salesforce Wear's Casino Reference Application with the Bionym Nymi Band. Source: Salesforce
Our Forum For Technology Management Leaders in London starts tomorrow, and I'm very excited about the program we have been able to put together across the two days. On day one, we will be hearing from Jeroen Tas, Chief Executive Officer, Informatics Solutions and Services, Philips Healthcare, about how he and his team have evolved IT to become a fundamental enabler of growth for Philips as a real-time, connected company. Jeroen has over 30 years of global experience as an entrepreneur and senior executive in the financial services, healthcare, and information technology industries. Before taking on his current position, Jeroen was the Group Chief Information Officer of Royal Philips, leading IT worldwide.
In the run-up to the Forum, I asked Jeroen to answer a number of questions on Philips Healthcare's digital business journey. Jeroen's answers are a must-read for technology management leaders in healthcare and other industries who are about to embark on the same journey, and they provide great insight into the challenges of making connected healthcare a reality. I look forward to hearing Jeroen speak on the main stage tomorrow!
Q: You have been a driving force behind Philips Healthcare’s strategy to create a connected healthcare world. Can you explain your approach?
On June 9, Docker.com announced that it will release version 1.0 of Docker, an open source platform that automates the deployment of applications as lightweight, portable, self-sufficient containers and runs them on virtually any infrastructure. The announcement indicates that the platform is ready for commercial use: Docker Engine provides the lightweight, portable runtime and packaging, while Docker Hub offers cloud services for application sharing and process automation.
We talked to some early adopters of Docker, including global ISVs and local solution providers. We believe that Docker-based solutions will disrupt the server virtualization market segment and further drive the adoption of cloud because of their:
Technology advantages. Today's componentized applications often rely on other components, applications, or services. For instance, your Ruby on Rails application might rely on MongoDB as a persistence layer while using nginx as a web server. Each component might also have its own set of dependencies, which can conflict with one another. Docker easily packages each component with its dependencies and isolates it in its own container, as the sketch below illustrates.
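A minimal sketch of that isolation, using the Docker SDK for Python and assuming a local Docker daemon; mongo and nginx are stock Docker Hub images, while the Rails image name is a hypothetical placeholder:

```python
import docker

# Connect to the local Docker daemon (assumes Docker is installed and running).
client = docker.from_env()

# Each component runs in its own container with its own dependencies,
# so MongoDB's libraries can never conflict with nginx's or the app's.
db = client.containers.run("mongo:latest", name="appdb", detach=True)
app = client.containers.run(
    "my-rails-app:latest",      # hypothetical image for the Rails app
    name="app",
    detach=True,
    links={"appdb": "mongo"},   # the app reaches MongoDB by hostname "mongo"
)
web = client.containers.run(
    "nginx:latest",
    name="web",
    detach=True,
    ports={"80/tcp": 8080},     # expose nginx on host port 8080
    links={"app": "app"},
)

for container in (db, app, web):
    print(container.name, container.status)
```

Because each container carries its own dependency set, the conflicting-versions problem the paragraph describes simply disappears: MongoDB, the app, and nginx upgrade independently.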
Earlier this spring, I set out to tell the responsive website management/operations story as a linkage between RWD's business metrics and its operational/site performance metrics and improvement tactics. Instead, I found a fragmented story: business teams have different processes, tools, and goals than technical teams, so 'management' happens in isolation from 'operations.' Business teams that need to prove the ROI of RWD simply have no direct linkage to site performance, operations, and monitoring efforts. Compounding the problem, many front-end development agencies that build responsive sites don't focus on metrics because they aren't contracted for managed services after the site goes live. As a result, responsive site owners/committees must find their own fix, and our recent research is designed to address both RWD's performance operations (i.e., speed) issues and business-value analysis for responsive sites:
When the first Linux distributions based on the 3.0 kernel were released almost a year ago, I was struck by how far Linux had advanced. The latest turn of the crank, in the form of Red Hat Enterprise Linux 7 (RHEL 7), reinforces this opinion. Built primarily on recent versions of the Linux 3.0 et seq. kernel available to the entire Linux community (including SUSE, Red Hat, Canonical, and others), RHEL 7 continues the progress of the Linux community toward an OS that is fully capable of replacing proprietary RISC/UNIX for the vast majority of enterprise workloads. It is apparent, both from the details of RHEL 7 and from perusing the documentation of other distribution providers, that Linux has continued to mature nicely, both as a foundation for large scale-out clouds and as a strong contender for the kind of enterprise workloads that were previously only comfortable on RISC/UNIX systems or large Microsoft Server systems. In effect, Linux has matured to the point where its feature set and scalability begin to look and feel like those of a top-tier UNIX.
In addition to the required low-level plumbing (schedulers, memory management, and file systems capable of keeping up with high-volume transaction workloads and operating effectively in large distributed clusters), Red Hat has also focused on features that improve the installation and management experience, directly reducing cost of ownership and following in the footsteps of other modern OS development trajectories.
Among the enterprise technologies that caught my eye:
Business needs and requirements demand expertise and coordination for privacy programs and practices. As a result, chief privacy officers, data protection officers, and other designated privacy professionals like privacy analysts are a fast-growing presence within the enterprise today. The International Association of Privacy Professionals (IAPP) is 16,000 members strong (compared to 7,500 back in 2010) and growing!
In many organizations, a dedicated privacy professional (e.g., a full-time employee who focuses on privacy, not someone who has privacy responsibilities attached to another role) is a new role. Privacy professionals come from a variety of backgrounds, from legal to IT, and the details of their role and focus vary depending on the organization and the size of the privacy team. Yet they all have one thing in common: they must work with multiple privacy stakeholders (IT, security, legal, HR, marketing, and more!) across the enterprise. And honestly, it's not always easy. Like any relationship, there are ups and downs.
At the Cisco Live 2014 event in San Francisco last week, we heard about plenty of updates, extensions, and new acquisitions to expand the business. The major technologies highlighted were InterCloud, Application Centric Infrastructure (ACI), and the Internet of Everything (IoE). Among these new offerings, Cisco's extended big data and analytics capabilities excited me the most. Why? Because its data virtualization techniques can help customers easily analyze large volumes of data no matter where it physically resides; its enhanced video analytics technology could improve the customer experience when checking out in retail stores or waiting for a train; and its IoE analytics and digital intelligence can increase customer engagement.
Data virtualization supports big data analytics. End user organizations realize the importance of making decisions quickly and carefully; to do this, they plan to centralize data from different branch offices or departments. But consolidating data that resides in multiple systems and global locations, or that is locked away in spreadsheets, is expensive. For example, telecom operators in China have hundreds of millions of subscribers and need to consolidate and analyze customer data that resides in 31 provincial companies. Physical data consolidation would be a huge and expensive project, but data virtualization technology can help solve this problem. Customers could consider adding Cisco to their data virtualization vendor shortlist, especially given Cisco's acquisition of Composite Software last July.
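To see why virtualization beats physical consolidation, consider a toy Python sketch in which in-memory SQLite databases stand in for the 31 provincial systems (the province names and subscriber counts are hypothetical): the same aggregate query is pushed down to every source, and only the small answers travel, never the raw subscriber rows.

```python
import sqlite3

# Stand-ins for the provincial systems; in reality these would be live
# connections to 31 separate remote databases, not in-memory SQLite.
def make_province_db(name: str, subscribers: int) -> sqlite3.Connection:
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE subscribers (province TEXT, n_subs INTEGER)")
    conn.execute("INSERT INTO subscribers VALUES (?, ?)", (name, subscribers))
    return conn

# Hypothetical figures for two of the 31 provincial companies.
provinces = [
    make_province_db("Guangdong", 120_000_000),
    make_province_db("Sichuan", 80_000_000),
]

def federated_total(sources: list) -> int:
    """Push the same aggregate query down to every source and combine
    the answers, without copying raw rows to a central warehouse."""
    total = 0
    for conn in sources:
        (subtotal,) = conn.execute(
            "SELECT SUM(n_subs) FROM subscribers"
        ).fetchone()
        total += subtotal
    return total

print(f"Total subscribers across provinces: {federated_total(provinces):,}")
```

A data virtualization layer like Composite's generalizes this pattern: it presents one virtual view, plans the query, and delegates the heavy lifting to the systems where the data already lives.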