The new data economy isn’t about data; it is about insights. How can I increase the availability of my locomotive fleet? How can I extend the longevity of my new tires? How can I improve my on-time-in-full rate? Which subscribers are most likely to churn in the near future? Where is the best location to build a new restaurant franchise or open a new retail outlet? Business decision-makers want answers to these kinds of questions, and new insights services providers are eager to help them.
A growing number of companies recognize the opportunity their data provides, and they take that data to market: one-third of firms report commercializing their data or sharing it with partners or customers for revenue. The recently published Forrester report Top Performers Commercialize Data Through Insights Services discusses the new trends in data commercialization: who is buying, who is selling, and what offerings are available, from direct data sales to the delivery of data-derived insights services.
While some commercializers avail themselves of data markets such as Dawex or DataStreamX, many are creating more sophisticated data-derived products and services. They are becoming insights services providers, often as an incremental offering to their existing customers. Some offer insights based on smart products and IoT analytics. Siemens Mobility, Boeing, and GM offer predictive maintenance for their trains, planes, and automobiles. In the agricultural products industry, companies such as Monsanto and DuPont offer services that prescribe what and when farmers should plant, when certain interventions, such as water or pesticide applications, are advisable, and when to harvest.
The explosive growth of the data economy is being fueled by the rise of insights services. Companies have been selling and sharing data for years. Acxiom and Experian made their names by providing access to rich troves of consumer data. Thomson Reuters, Dun & Bradstreet, and Bloomberg distributed financial and corporate data. Data brokers of various kinds connected buyers and sellers across a rich data market landscape. Customers, however, needed to be able to manage and manipulate the data to derive value from it. That required the right set of tools and technologies and a high degree of data expertise. Without that data savvy, insights could be elusive.
The new data market is different: insights services providers do the heavy lifting, delivering relevant and actionable insights directly into decision-making processes. These insights services providers come in a number of flavors. Some provide insights relevant to a particular vertical; others focus on a particular domain, such as risk mitigation, or a function within an organization, such as sales, marketing, or operations.
Recently, the mobile industry's largest annual get-together, Mobile World Congress (MWC), took place in Barcelona. In my opinion, the biggest themes at MWC in 2017 that are relevant for enterprise customers were the internet of things (IoT), artificial intelligence (AI), platforms, collaboration, and connectivity. These themes underline how mobility is becoming part of the broader digital transformation initiative. I discuss this shift in a separate blog and report. MWC provided several valuable insights for business and technology leaders seeking to align their mobile strategies with their digital strategies:
-> Not everything that claims to be AI is true AI. Few of the vendors that claimed AI proficiency during MWC are in fact able to deliver true machine-learning solutions that generate transformative customer and operational insights. Most solutions that were branded as AI at MWC rely on preprogrammed responses and statistics rather than machine learning.
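The distinction above can be made concrete with a small sketch. Everything here is hypothetical and deliberately simplified: the first responder is "AI-branded" but preprogrammed (a fixed lookup of canned answers), while the second actually learns its behavior from labeled data, here via a tiny nearest-centroid classifier written from scratch.

```python
# "AI-branded" but preprogrammed: a fixed lookup of canned responses.
# Topics and replies are made-up examples, not any vendor's product.
CANNED = {
    "billing": "Please check your invoice portal.",
    "outage": "We are aware of the issue.",
}

def rule_based_reply(topic):
    return CANNED.get(topic, "Sorry, I don't understand.")

# Machine learning: fit a nearest-centroid classifier on labeled feature
# vectors, so the behavior is derived from data rather than hand-coded rules.
def fit_centroids(samples):
    """samples: list of (feature_vector, label); returns label -> centroid."""
    sums, counts = {}, {}
    for vec, label in samples:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in acc] for lbl, acc in sums.items()}

def predict(centroids, vec):
    """Assign vec to the label whose centroid is closest (squared distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lbl: dist(centroids[lbl], vec))
```

The rule-based responder can only ever return what was preprogrammed; the classifier's answers change when you retrain it on different data, which is the essential difference the bullet describes.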
Since the term BI is often used to also include data management processes and technologies, let's assume that in your case you are only looking for the expertise required to build reports and dashboards, and that it does not include:
Data integration (ETL, etc.) expertise
Data governance (master data management, data quality, etc.) expertise
Data modelling (relational and multidimensional) expertise
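To illustrate the boundary being drawn above, here is a minimal sketch of "reporting-only" BI work: the data is assumed to be already integrated, governed, and modeled upstream, so the report builder only aggregates and presents it. The table and column names are hypothetical.

```python
from collections import defaultdict

# Clean, pre-modeled fact rows: (region, product, revenue).
# In practice these would come from a governed warehouse table.
sales = [
    ("EMEA", "widgets", 1200.0),
    ("EMEA", "gadgets", 800.0),
    ("APAC", "widgets", 500.0),
]

def revenue_by_region(rows):
    """Aggregate revenue per region -- pure reporting logic, no ETL."""
    totals = defaultdict(float)
    for region, _product, revenue in rows:
        totals[region] += revenue
    return dict(totals)

def render_report(totals):
    """Format the aggregates as a simple text report."""
    lines = ["Revenue by region"]
    for region in sorted(totals):
        lines.append(f"{region:6} {totals[region]:10.2f}")
    return "\n".join(lines)
```

Everything excluded above (integration, governance, modelling) happens before rows like `sales` exist; the reporting skill set is what turns them into the output.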
Enterprises agree that speedy deployment of big data Hadoop platforms has been critical to their success, especially as use cases expand and proliferate. However, deploying Hadoop systems is often difficult, especially when supporting complex workloads and dealing with hundreds of terabytes or petabytes of data. Architects need a considerable amount of time and effort to install, tune, and optimize Hadoop. Hadoop-optimized systems (aka appliances) make on-premises deployments virtually instant and blazing fast to boot. Unlike generic hardware infrastructure, Hadoop-optimized systems are preconfigured, integrated bundles of hardware and software components designed to deliver optimal performance and support various big data workloads. They also support one or more of the major distributions, such as Cloudera, Hortonworks, IBM BigInsights, and MapR. As a result, organizations spend less time installing, tuning, troubleshooting, patching, upgrading, and dealing with integration- and scale-related issues.
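Much of the tuning such appliances bake in is ordinary Hadoop/YARN configuration. As a purely illustrative example, a fragment of the kind of settings a vendor might preconfigure in `yarn-site.xml` could look like this; the property names are standard YARN settings, but the values are hypothetical, not vendor recommendations:

```xml
<!-- Illustrative yarn-site.xml fragment: what an appliance preconfigures
     so architects do not have to size containers by hand. -->
<configuration>
  <property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>98304</value> <!-- memory YARN may allocate on each node -->
  </property>
  <property>
    <name>yarn.scheduler.maximum-allocation-mb</name>
    <value>16384</value> <!-- largest single container request allowed -->
  </property>
</configuration>
```

Multiply a handful of such files (HDFS, YARN, MapReduce, plus OS and network settings) across a large cluster, and the time savings the paragraph describes become clear.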
Choose From Eight Hadoop-Optimized Systems Vendors
Customers’ perception of a company depends on their experiences with the organization at every point of contact. Companies can try to change how customers view a brand in a number of ways, such as a new mobile app or an improved complaint-handling process. However, to really improve customer perception, every interaction at every touchpoint must answer questions, suggest new services, and deepen the relationship. Many firms fail to tap into business opportunities that their front-line employees encounter because their processes and technology are antiquated.
Enterprise architecture (EA) programs can lead the effort to address these limitations and deliver benefits to customers. British Gas, one of the winners of the 2015 Forrester/InfoWorld EA Awards, is a firm that seized its opportunity. My recent report, Enterprise Architects Transform Customer Engagement, analyzes the key practices enterprise architects at British Gas adopted to serve as brand ambassadors and improve customer satisfaction levels, and highlights key lessons for EA leaders. These practices include:
APIs, cloud, and big data technologies power the new engagement platform. To build an engagement platform that delivers customer insights to front-line engineers, the British Gas EA team developed a platform architecture that uses APIs and cloud and big data technologies to support a new engagement platform and the applications on top of it. The API mechanism simplifies digital connections to business applications; cloud infrastructure provides robustness and agility for business operations; big data technology arms field engineers with customer insights; and policies and multitenancy ensure flexibility and security.
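The pattern described above — an API layer that turns big data into next-best-action hints for a front-line engineer — can be sketched in a few lines. Every name, field, and scoring rule here is a hypothetical illustration of the architecture, not British Gas's actual API.

```python
# Stands in for the big data layer behind the API; keys and fields invented.
CUSTOMER_STORE = {
    "cust-42": {"boiler_age_years": 12, "service_plan": False,
                "open_complaints": 0},
}

def get_engineer_insights(customer_id):
    """API handler: turn raw customer data into hints for a field engineer."""
    record = CUSTOMER_STORE.get(customer_id)
    if record is None:
        return {"status": 404, "insights": []}
    insights = []
    # Illustrative rules; in the real platform these would be driven by
    # analytics over the big data layer rather than hard-coded thresholds.
    if record["boiler_age_years"] > 10 and not record["service_plan"]:
        insights.append("Suggest a service plan for the ageing boiler.")
    if record["open_complaints"]:
        insights.append("Acknowledge the open complaint before upselling.")
    return {"status": 200, "insights": insights}
```

The point of the indirection is the one the paragraph makes: the engineer's app only sees a simple API contract, while the data plumbing behind it can change freely.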
Open source big data technologies like Hadoop have done much to begin the transformation of analytics. We're moving from expensive and specialist analytics teams towards an environment in which processes, workflows, and decision-making throughout an organisation can - in theory at least - become usefully data-driven. Established providers of analytics, BI and data warehouse technologies liberally sprinkle Hadoop, Spark and other cool project names throughout their products, delivering real advantages and real cost-savings, as well as grabbing some of the Hadoop glow for themselves. Startups, often closely associated with shepherding one of the newer open source projects, also compete for mindshare and custom.
And the opportunity is big. Hortonworks, for example, has described the global big data market as a $50 billion opportunity. But that pales into insignificance next to what Hortonworks (again) describes as a $1.7 trillion opportunity. Other companies and analysts have their own numbers, which differ, but the step change is clear and significant. Hadoop, and the vendors gravitating to that community, mostly address 'data at rest': data that has already been collected from some process, interaction, or query. The bigger opportunity relates to 'data in motion,' and to the internet of things that will be responsible for generating so much of it.
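The at-rest versus in-motion distinction is easy to show with a toy computation: the same average, computed once over a stored dataset versus incrementally over a live stream of arriving events. The sensor readings are made up for illustration.

```python
def batch_average(stored_readings):
    """Data at rest: the full dataset is available before computation starts."""
    return sum(stored_readings) / len(stored_readings)

def streaming_averages(stream):
    """Data in motion: emit an updated average after every arriving event,
    without ever holding the whole dataset."""
    total, count = 0.0, 0
    for reading in stream:
        total += reading
        count += 1
        yield total / count
```

The batch function is the Hadoop-style 'data at rest' model; the generator is a minimal stand-in for the streaming engines (and IoT workloads) the paragraph points to, which must produce answers while the data is still arriving.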
Huawei Technologies started out nearly 30 years ago as a small private company with 14 employees and 140,000 yuan in capital. By 2015, its total revenue exceeded $60 billion. Huawei is already a global company, but its globalization journey has been a difficult one from the very beginning. Despite its continuous business growth in other regions, Huawei has faced intense scrutiny in the US from day one — and last week the US government put Huawei under the microscope yet again.
National security is important, but using “national security” as an excuse for allowing unfair competition will only harm customers. It’s time for the governments of both countries to trust each other more. I’ve recently published a report focusing on Huawei’s continuous progress toward becoming a key enabler of digital transformation in the telco and enterprise spaces. Some of the key takeaways:
Huawei has holistic strategies for digital transformation. Huawei’s broad vision of digital strategy — which focuses on cloud enablement and readiness, partner enablement, and open source co-creation — has helped the firm sustain strong business growth in the telco and enterprise markets. For example, its partnerships with T-Systems on the Open Telekom Cloud in Germany and with Telefónica on public cloud in the Americas have helped carriers in local markets give cloud users on-demand, all-online, self-service experiences.
The Background – Linux as a Fast Follower and the Need for Hot Patching
No doubt about it, Linux has made impressive strides in the last 15 years, gaining many features previously associated with high-end proprietary Unix as it made the transition from small-system plaything to core enterprise processing resource and the engine of the extended web as we know it. Along the way it gained reliable and highly scalable schedulers, a multiplicity of efficient and scalable file systems, advanced RAS features, its own embedded virtualization, and efficient thread support.
As Linux grew, so did supporting hardware, particularly the capabilities of the ubiquitous x86 CPU upon which the vast majority of Linux runs today. The debate, though, has always been about how close Linux could get to "the real OS" — the core proprietary Unix variants that for two decades defined the limits of non-mainframe scalability and reliability. But "the times they are a-changin'", and the new narrative may be "when will Unix catch up to Linux on critical RAS features like hot patching?"
Hot patching, the ability to apply updates to the OS kernel while it is running, is a long-sought-after but elusive feature of a production OS. Long sought after because both developers and operations teams recognize that bringing down an OS instance that is doing critical high-volume work is at best disruptive and at worst a logistical nightmare, and elusive because it is incredibly difficult. There have been several failed attempts, and several implementations that "almost worked" but were so fraught with exceptions that they were not really useful in production.[i]
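As a rough user-space analogy (sketched in Python for brevity) of what kernel hot patching accomplishes: calls to a buggy routine are redirected to a fixed one while the process keeps running, with no restart. Real kernel implementations such as Linux's livepatch work very differently — redirecting calls at the kernel level — so treat this purely as an illustration of the concept; the function names and the "bug" are invented.

```python
def buggy_discount(price):
    return price * 0.5   # hypothetical bug: applies 50% off instead of 5%

def fixed_discount(price):
    return price * 0.95  # the patched behavior

# All calls go through one mutable indirection point, loosely analogous to a
# patchable call site in a running kernel.
_current = {"discount": buggy_discount}

def discount(price):
    return _current["discount"](price)

def hot_patch():
    """Swap in the fixed implementation while callers keep running."""
    _current["discount"] = fixed_discount
```

The hard part in a real kernel — and the reason the feature stayed elusive — is doing this swap safely while threads may be executing inside the old function, something this toy sketch entirely sidesteps.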