AI Is The Sincerest Form of Flattery ... And Fear

Pure AI is true intelligence that can mimic or exceed the intelligence of human beings. It is still a long way off, if it can ever be achieved. But what if AI became pure, able to perceive, think, act, and even replicate as we do? Look to humanity for the answer. Humanity has been both beautiful and brutal:

  • The beauty of ingenuity, survival, exploration, art, and kindness.
  • The brutality of crime, war, and pettiness.
Read more

Micro Explanations For Nine Essential AI Technologies

Artificial Intelligence (AI) is not one big, specific technology. Rather, it is comprised of one or more building-block technologies. So, to understand AI, you have to understand each of these nine building-block technologies. You could argue that there are more technologies than the ones listed here, but any additional technology can fit under one of these building blocks. This is a follow-on to my post Artificial Intelligence: Fact, Fiction. How Enterprises Can Crush It.

Here are the nine pragmatic AI technology building blocks that enterprises can leverage now:

■        Knowledge engineering. Knowledge engineering is a process to understand and then represent human knowledge in data structures, semantic models, and heuristics (rules). AD&D pros can embed this engineered knowledge in applications to solve complex problems that are generally associated with human expertise. For example, large insurers have used knowledge engineering to represent and embed the expertise of claims adjusters to automate the adjudication process. IBM Watson Health uses engineered knowledge in combination with a corpus of information that includes over 290 medical journals, textbooks, and drug databases to help oncologists choose the best treatment for their patients.
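As a toy illustration, engineered knowledge can be represented as explicit rules (heuristics) that software evaluates in priority order. The claim fields, thresholds, and decisions below are all invented for the sketch; a real adjudication rule base is vastly larger:

```python
def adjudicate(claim):
    """Apply heuristic rules, in priority order, to an insurance claim."""
    rules = [
        # (condition, decision) pairs capture adjuster expertise as data.
        (lambda c: c["amount"] > 10_000,            "refer-to-adjuster"),
        (lambda c: c["policy_active"] is False,     "deny"),
        (lambda c: c["claim_type"] == "windshield", "auto-approve"),
    ]
    for condition, decision in rules:
        if condition(claim):
            return decision
    return "manual-review"  # no rule fired

claim = {"amount": 450, "policy_active": True, "claim_type": "windshield"}
print(adjudicate(claim))  # auto-approve
```

Because the rules live in a data structure rather than scattered through application logic, subject-matter experts can review and extend them without rewriting the application.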

Read more

Artificial Intelligence: Fact, Fiction. How Enterprises Can Crush It

Forrester surveyed business and technology professionals and found that 58% of them are researching AI, but only 12% are using AI systems. This gap reflects growing interest in AI, but little actual use at this time. We expect enterprise interest in, and use of, AI to increase as software vendors roll out AI platforms and build AI capabilities into applications. Enterprises that plan to invest in AI expect to improve customer experiences, improve products and services, and disrupt their industry with new business models.

But the burning question is: how can your enterprise use AI today to crush it? To answer this question, we first must bring clarity to the nebulous definition of AI. Let's break it down further:

■        “Artificial” is the opposite of organic. Artificial simply means person-made rather than occurring naturally in the universe. Computer scientists, engineers, and developers research, design, and create a combination of software, computers, and machines to manifest AI technology.

■        “Intelligence” is in the eye of the beholder. Philosophers will have job security for a very long time trying to define intelligence precisely. That's because intelligence is tough to pin down: we humans routinely assign intelligence to all manner of things, including well-trained dachshunds, self-driving cars, and “intelligent” assistants such as Amazon Echo. Intelligence is relative. For AI purists, intelligence is more akin to human abilities: the ability to perceive one's environment, take actions that satisfy a set of goals, and learn from both successes and failures. Intelligence varies greatly among humans, and so too does it vary among AI systems.
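To make the purist's perceive-act-learn loop concrete, here is a toy agent; the environment (a hidden number) and the learning rule (narrowing a search range after each failure) are both invented purely for illustration:

```python
def run_agent(hidden, low=0, high=100):
    """A toy agent: act (guess), perceive (compare), learn (narrow the range)."""
    attempts = 0
    while True:
        guess = (low + high) // 2      # act: pick the midpoint of the range
        attempts += 1
        if guess == hidden:            # perceive: the environment confirms success
            return attempts
        if guess < hidden:             # learn from failure: discard the low half
            low = guess + 1
        else:                          # ... or discard the high half
            high = guess - 1

print(run_agent(hidden=42))  # 7 guesses to converge on the goal
```

Even this trivial loop has the three ingredients the purist definition demands; the hard part of real AI is doing the same against a messy, open-ended environment.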

Temper Your Expectations, But Don’t Give Up On AI

Read more

On-Premise Hadoop Just Got Easier With These 8 Hadoop-Optimized Systems

Enterprises agree that speedy deployment of big data Hadoop platforms has been critical to their success, especially as use cases expand and proliferate. However, deploying Hadoop systems is often difficult, especially when supporting complex workloads and dealing with hundreds of terabytes or petabytes of data. Architects need a considerable amount of time and effort to install, tune, and optimize Hadoop. Hadoop-optimized systems (aka appliances) make on-premises deployments virtually instant and blazing fast to boot. Unlike generic hardware infrastructure, Hadoop-optimized systems are preconfigured, integrated hardware and software components designed to deliver optimal performance and support various big data workloads. They also support one or more of the major distros such as Cloudera, Hortonworks, IBM BigInsights, and MapR. As a result, organizations spend less time installing, tuning, troubleshooting, patching, upgrading, and dealing with integration- and scale-related issues.

Choose From Among 8 Hadoop-Optimized Systems Vendors

Noel Yuhanna and I published The Forrester Wave: Big Data Hadoop-Optimized Systems, Q2 2016, in which we evaluated 7 of the 8 options in the market. HP Enterprise's solution was not evaluated in this Wave, but Forrester also considers HPE a key player in the market for Hadoop-optimized systems, along with the 7 vendors we did evaluate.

Read more

15 "True" Streaming Analytics Platforms For Real-Time Everything

Streaming Analytics Captures Real-Time Intelligence

Most enterprises aren't fully exploiting real-time streaming data that flows from IoT devices and mobile, web, and enterprise apps. Streaming analytics is essential for real-time insights and for bringing real-time context to apps. Don't dismiss streaming analytics as a form of "traditional analytics" used for postmortem analysis. Far from it: streaming analytics analyzes data right now, when it can be analyzed and put to good use to make applications of all kinds (including IoT) contextual and smarter. Forrester defines streaming analytics as:

Software that can filter, aggregate, enrich, and analyze a high throughput of data from multiple, disparate live data sources and in any data format to identify simple and complex patterns to provide applications with context to detect opportune situations, automate immediate actions, and dynamically adapt.

Forrester Wave: Big Data Streaming Analytics, Q1 2016

To help enterprises understand what commercial and open source options are available, Rowan Curran and I evaluated 15 streaming analytics vendors using Forrester's Wave methodology. Forrester clients can read the full report to understand the market category and see the detailed criteria, scores, and rankings of the vendors. Here is a summary of the 15 vendors' solutions we evaluated, listed in alphabetical order:

Read more

Hadoop Is Data's Darling For A Reason

Hadoop thoroughly disrupts the economics of data, analytics, and data-driven applications. That's cool because the unfortunate truth has been that the potential of most data lies dormant. On average, between 60% and 73% of all data within an enterprise goes unused for analytics. That's unacceptable in an age where deeper, actionable insights, especially about customers, are a competitive necessity. Enterprises are responding by adopting what Forrester calls "Hadoop and friends" (friends such as Spark and Kafka and others). Get Hadoop, but choose the distribution that is right for your enterprise.

Solid Choices All Around Make For Tough Choices

Forrester evaluated five key Hadoop distributions, from Cloudera, Hortonworks, IBM, MapR Technologies, and Pivotal Software. Our evaluation of big data Hadoop distributions uncovered a market with four Leaders and one Strong Performer:

  • Cloudera, MapR Technologies, IBM, and Hortonworks are Leaders. Enterprise Hadoop is a market that is not even 10 years old, but Forrester estimates that 100% of all large enterprises will adopt it (Hadoop and related technologies such as Spark) for big data analytics within the next two years. The stakes are exceedingly high for the pure-play distribution vendors Cloudera, Hortonworks, and MapR Technologies, which have all of their eggs in the Hadoop basket. Currently, there is no absolute winner in the market; each of the vendors focuses on key features such as security, scale, integration, governance, and performance critical for enterprise adoption.

Read more

The Predictive Modeling Process Using Machine Learning

Predictive analytics uses statistical and machine learning algorithms to find patterns in data that might predict similar outcomes in the future. Check out this fun and fruity video, under three minutes long, to understand the six steps of predictive modeling. For tools that use machine learning to build predictive models, Forrester clients can read The Forrester Wave: Big Data Predictive Analytics Solutions, Q2 2015 and A Machine Learning Primer For BT Professionals.
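As a back-of-the-napkin illustration of the core idea, fitting a model to historical data and then using it to predict a future outcome, here is a one-variable ordinary-least-squares fit. The spend and sales figures are invented, and real predictive analytics tools apply far richer machine-learning algorithms:

```python
def fit_line(xs, ys):
    """Fit y = slope * x + intercept by ordinary least squares."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx  # (slope, intercept)

# Historical data: ad spend (k$) vs. sales (k$), invented for illustration.
spend, sales = [1, 2, 3, 4], [12, 19, 31, 38]
m, b = fit_line(spend, sales)
print(round(m * 5 + b, 1))  # predicted sales at spend = 5 -> 47.5
```

Every step of the six-step modeling process, in miniature: historical data in, pattern (the line) out, prediction applied to a value the model has never seen.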

Apache Spark's Marriage To Hadoop Will Be Bigger Than Kim And Kanye

Apache Spark is an open source cluster computing platform designed to process big data as efficiently as possible. Sound familiar? That's what Hadoop is designed to do. However, these are distinctly different, but complementary, platforms. Hadoop is designed to process large volumes of data that live in the Hadoop Distributed File System (HDFS). Spark is also designed to process large volumes of data, but much more efficiently than MapReduce, in part by caching data in memory. But to say that Spark is just an in-memory data processing platform is a gross oversimplification and a common misconception. It also has a unique development framework that simplifies the development and efficiency of data processing jobs. You'll often hear Hadoop and Spark mentioned in the same breath. That's because, although they are independent platforms in their own right, they have an evolving, symbiotic relationship. Application development and delivery (AD&D) professionals must understand the key differences and synergies between this next-generation cluster-computing power couple to make informed decisions about their big data strategy and investments. Forrester clients can read the full report explaining the differences and synergies here: Apache Spark Is Powerful And Promising
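A toy way to see why in-memory caching matters for iterative workloads (the distinction drawn above between Spark and classic MapReduce): without caching, every pass re-runs the expensive transformation; with caching, it runs once. All function names and data below are invented for the sketch:

```python
def expensive_transform(records, counter):
    """Stand-in for a heavy transformation (e.g., parsing and joining)."""
    counter["runs"] += 1               # track how often we recompute
    return [r * 2 for r in records]

def iterate(records, passes, cache=False):
    """Run several passes over transformed data, with or without caching."""
    counter = {"runs": 0}
    cached = expensive_transform(records, counter) if cache else None
    for _ in range(passes):
        data = cached if cache else expensive_transform(records, counter)
        _ = sum(data)                  # stand-in for one iteration of the job
    return counter["runs"]

print(iterate([1, 2, 3], passes=5))              # 5 (recomputed every pass)
print(iterate([1, 2, 3], passes=5, cache=True))  # 1 (computed once, reused)
```

Iterative algorithms such as machine learning training make many passes over the same data, which is exactly where keeping the working set in memory pays off.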
Read more

Forrester’s Hadoop Predictions 2015

Hadoop adoption and innovation is moving forward at a fast pace, playing a critical role in today's data economy. But, how fast and far will Hadoop go heading into 2015? 
 
Prediction 1: Hadooponomics makes enterprise adoption mandatory. The jury is in. Hadoop has been found not guilty of being an over-hyped open source platform. Hadoop has proven real enterprise value in any number of use cases including data lakes, traditional and advanced analytics, ETL-less ETL, active-archive, and even some transactional applications. All these use cases are powered by what Forrester calls “Hadooponomics” — its ability to linearly scale both data storage and data processing.
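As a toy sketch of that linear-scale idea, hash-partitioning records across nodes means that doubling the nodes roughly halves the data, and therefore the work, per node. The record and node counts below are invented for illustration:

```python
def partition(records, num_nodes):
    """Hash-partition records so each node holds roughly 1/num_nodes of them."""
    nodes = [[] for _ in range(num_nodes)]
    for r in records:
        nodes[hash(r) % num_nodes].append(r)  # same record always maps to the same node
    return nodes

records = list(range(1000))  # stand-in for a large dataset
for n in (5, 10):
    sizes = [len(p) for p in partition(records, n)]
    print(n, max(sizes))  # doubling the nodes halves the per-node load: 200, then 100
```

That per-node load is what lets Hadoop-style platforms grow storage and processing by simply adding commodity machines.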
 
What it means: The remaining minority of dazed and confused CIOs will make Hadoop a priority for 2015.
 
Predictions 2 and 3: Forrester clients can read the full text of all 8 Hadoop Predictions.
 
Read more

What Qualities Do Great Enterprise Application Developers Possess?

What are you doing on October 16th and 17th? That's when Forrester's Forum for Application Development & Delivery Professionals will be held in Chicago. Join us this year for lively sessions, networking, and discussions about building software that powers your business. The agenda is hot, including a session from me on The Unstoppable Momentum Of Hadoop and a guest speaker from McDonald's on How McDonald's Plans To Leverage Its New Digital Platform To Revolutionize Customer Experiences.

We have lots of fun at these events too. Check out this video of last year's event where we grabbed both clients and analysts and asked them an important, and to some, philosophical question: What Makes A Great Application Developer? See if you'd answer the same way.