New Announcements Foreshadow Fundamental Changes in Server and Storage Architectures

Richard Fichera

My colleague Henry Baltazar and I have been watching the development of new systems and storage technology for years now, and each of us has been trumpeting, in our own way, the future potential of new non-volatile memory (NVM) technology not only to provide a major leap beyond current flash-based storage but also to trigger a major transformation in how servers and storage are architected and deployed, and eventually in how software treats persistent versus non-persistent storage.

All well and good, but until very recently we were limited to vague prognostications about which flavor of NVM would finally belly up to the bar for mass production and how the resultant systems might be architected. In the last 30 days, two major technology developments, Intel’s further disclosure of its future joint-venture NVM technology (now known as 3D XPoint™ Technology) and Diablo Technologies’ introduction of Memory1, have allowed us to sharpen the focus on the potential outcomes and routes to market for this next wave of infrastructure transformation.
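To make the software implication concrete, here is a purely illustrative Python sketch of the programming model that byte-addressable NVM points toward: updating persistent state with memory semantics rather than explicit file I/O. An ordinary memory-mapped file stands in for an NVM region here; the path and the 4KB size are assumptions for the example, not anything Intel or Diablo has specified.

    # Illustrative only: a memory-mapped file as a stand-in for an NVM region.
    import mmap

    PATH = "/tmp/nvm_region.bin"  # hypothetical stand-in for an NVM device
    SIZE = 4096

    # Create a backing file of the desired size.
    with open(PATH, "wb") as f:
        f.truncate(SIZE)

    with open(PATH, "r+b") as f:
        region = mmap.mmap(f.fileno(), SIZE)
        region[0:5] = b"hello"    # a plain in-memory write...
        region.flush()            # ...made durable with an explicit flush
        region.close()

With real NVM on the memory bus, that flush boundary, rather than read/write system calls, becomes the line between volatile and persistent state, which is one reason the software view of persistent versus non-persistent storage would have to change.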

Intel/Micron Technology 3D XPoint Technology

Read more

You May Not Need A CDO - But Wouldn’t You Want To Improve Your Odds Of Success?

Jennifer Belissent, Ph.D.

Gene Leganza and I just published a report on the role of the Chief Data Officer that we’re hearing so much about these days: Top Performers Appoint Chief Data Officers. To introduce the report, we sat down with our press team at Forrester to talk about the findings and the implications for our clients.

Forrester PR: There's a ton of fantastic data in the report around the CDO. If you had to call out the most surprising finding, what would top your list?

Gene: No question, it's the strong correlation between high-performing companies and the presence of a CDO. Jennifer and I both feel that strong data capabilities are critical for organizations today and that the data agenda is quite complex and in need of strong leadership. All of that makes it logical to expect a correlation between strong data leadership and company performance - but given the relative newness of the CDO role, it was surprising to see firm performance so closely linked to it.

Of course, you can't infer cause and effect from correlation: the data could mean that execs at high-performing companies think having a CDO is a good idea just as much as it could mean that CDOs are materially contributing to high performance. Either way, that single statistic should prompt organizations without clear data leadership to take a serious look at the role.
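To see that caveat in miniature, here is a toy Python snippet on made-up numbers (emphatically not the report's data): a binary "has a CDO" flag can correlate strongly with a performance score, yet the coefficient says nothing about which way the influence runs.

    # Synthetic illustration only; every number below is invented.
    import numpy as np

    has_cdo = np.array([1, 1, 1, 0, 0, 1, 0, 0, 1, 0])
    perf    = np.array([9.1, 8.4, 8.8, 6.2, 5.9, 7.9, 6.5, 5.4, 8.2, 6.0])

    # Pearson r (equivalent to a point-biserial correlation here).
    r = np.corrcoef(has_cdo, perf)[0, 1]
    print(f"correlation: {r:.2f}")  # a strong r, but direction of causality unknown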

Read more

Make Your BI Environment More Agile With BI on Hadoop

Boris Evelson
In the past three decades, management information systems, data integration, data warehouses (DWs), BI, and other relevant technologies and processes have only scratched the surface of turning data into useful information and actionable insights:
  • Organizations leverage less than half of their structured data for insights. The latest Forrester data and analytics survey finds that organizations use on average only 40% of their structured data for strategic decision-making. 
  • Unstructured data remains largely untapped. Organizations are even less mature in their use of unstructured data. They tap only about a third of their unstructured data sources (28% of semistructured and 31% of unstructured) for strategic decision-making. And these percentages don’t include more recent components of a 360-degree view of the customer, such as voice of the customer (VoC), social media, and the Internet of Things. 
  • BI architectures continue to become more complex. The intricacies of earlier-generation and many current business intelligence (BI) architectural stacks, which usually require the integration of dozens of components from different vendors, are just one reason it takes so long and costs so much to deliver a single version of the truth with a seamlessly integrated, centralized enterprise BI environment.
  • Existing BI architectures are not flexible enough. Most organizations take too long to reach the ultimate goal of a centralized BI environment, and by the time they think they are done, new data sources, new regulations, and new customer needs all require further changes to the BI environment; querying data in place on Hadoop, as in the sketch after this list, is one way to regain some of that agility.
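As a hedged illustration of that last point, here is a minimal PySpark sketch of "BI on Hadoop": querying semi-structured data in place with Spark SQL rather than first landing it in a warehouse. The HDFS path and the event_time field are invented for the example, not taken from the research.

    # A minimal sketch of "BI on Hadoop": query raw JSON events in place.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("bi-on-hadoop").getOrCreate()

    # Read semi-structured data straight off HDFS; Spark infers the schema.
    events = spark.read.json("hdfs:///data/raw/clickstream/")
    events.createOrReplaceTempView("clickstream")

    # Analysts can issue familiar SQL without an upfront ETL/DW cycle.
    daily = spark.sql("""
        SELECT to_date(event_time) AS day, COUNT(*) AS events
        FROM clickstream
        GROUP BY to_date(event_time)
        ORDER BY day
    """)
    daily.show()

The point of the sketch is architectural rather than syntactic: the query runs where the data already lives, so adding a new source is a new read, not a new integration project.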
Read more

Don't Throw Hadoop At Every BI Challenge

Boris Evelson

The explosion of data and fast-changing customer needs have led many companies to a realization: They must constantly improve their capabilities, competencies, and culture in order to turn data into business value. But how do Business Intelligence (BI) professionals know whether they must modernize their platforms or whether their main challenges are mostly about culture, people, and processes?

"Our BI environment is only used for reporting — we need big data for analytics."

"Our data warehouse takes very long to build and update — we were told we can replace it with Hadoop."

These are just some of the conversations that Forrester clients initiate, believing they require a big data solution. But after a few probing questions, companies often realize that what they really need is to upgrade an outdated BI platform, switch to a different database architecture, add nodes to their data warehouse (DW) servers, improve their data quality and data governance processes, or apply other commonsense remedies. New big data technologies may be one option, but they are not the only one, and sometimes not the best. Rather than assuming that big data is the panacea for all the issues associated with poorly architected and deployed BI environments, BI pros should follow the guidelines in Forrester's recent report to decide whether their BI environment needs a healthy dose of upgrades and process improvements or whether it genuinely requires big data technologies. Here are some of the findings and recommendations from the full research report:

1) Hadoop won't solve your cultural challenges

Read more

Hit the ground running with a new BI initiative

Boris Evelson

Even though business intelligence applications have been around for decades, lots of people still struggle with the question “How do I get started with BI?” I constantly deal with clients who mistakenly start their BI journey by selecting a BI platform first or by neglecting the data architecture. I know it’s a HUGE oversimplification, but in a nutshell, here’s a simple roadmap (for a more complete version, please see the Roadmap document in the Forrester BI Playbook) that will ensure your BI strategy is aligned with your business strategy and help you hit the ground running. The best way to start, IMHO, is from the performance management point of view:

  1. Catalog your organization's business units and departments.
  2. For each business unit or department, ask questions about their business strategy and objectives.
  3. Then ask what goals they set for themselves in order to achieve those objectives.
  4. Next, ask what metrics and indicators they use to track where they are against their goals and objectives. A good rule of thumb: no business area or department needs to track more than 20 to 30 metrics; more than that is unmanageable.
  5. Finally, ask how they would like to slice and dice these metrics (by time period, region, business unit, customer segment, etc.); the sketch after this list shows one way to capture the results.
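To make the output of these five steps tangible, here is a small Python sketch of the catalog they produce; every name and number below is a placeholder I invented, not a Forrester artifact.

    # A toy model of the performance management catalog from steps 1-5.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Metric:
        name: str                  # step 4: what gets tracked
        goal: str                  # step 3: the target it tracks progress against
        dimensions: List[str] = field(default_factory=list)  # step 5: slice/dice

    @dataclass
    class BusinessUnit:
        name: str                  # step 1: the unit or department
        objectives: List[str]      # step 2: its strategy and objectives
        metrics: List[Metric]      # steps 3-5

    sales = BusinessUnit(
        name="Sales",
        objectives=["Grow revenue 10% year over year"],
        metrics=[
            Metric(
                name="Quarterly revenue",
                goal="$12M per quarter",
                dimensions=["time period", "region", "customer segment"],
            )
        ],
    )

    # The rule of thumb from step 4: keep the metric list manageable.
    assert len(sales.metrics) <= 30

Walking every department through this structure before any platform selection is what keeps the BI strategy tied to the business strategy.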
Read more

Systems Of Insight: Next Generation Business Intelligence

Boris Evelson

Earlier Generation BI Needs A Tune Up

Business intelligence has gone through multiple iterations in the past few decades. While BI's evolution has addressed some of the technology and process shortcomings of the earlier management information systems, BI teams still face challenges. Enterprises are transforming only 40% of their structured data and 31% of their unstructured data into information and insights. In addition, 63% of organizations still use spreadsheet-based applications for more than half of their decisions. Many earlier and current enterprise BI deployments:

  • Have hit the limits of scalability.
  • Struggle to address rapid changes in customer and regulatory requirements.
  • Fail to break through waterfall's design limitations.
  • Suffer from mismatched business and technology priorities and languages.
Read more

Data Today Keeps The Doctor Away

Jennifer Belissent, Ph.D.

In scanning through my O’Reilly Data Newsletter today, I noticed A Healthy Dose of Data, an MIT Sloan case study on the data and analytics culture at Intermountain, a healthcare network that runs 22 hospitals and 185 clinics.  The study is definitely worth the read.  It reviews the history of data use at Intermountain, which began way before the “big data” craze of recent years.  In fact, it was back in the 1950s that one of the Intermountain cardiologists, Homer Warner, began to explore clinical data to understand why some heart patients experienced better outcomes than others.  He went on to become known as the “father of medical informatics – the use of computer programs to analyze patient data to determine treatment protocols,” and with colleagues designed and launched their first decision-support tool. 

The case study goes on to describe how Intermountain has cultivated a strong data and analytics culture. Over time – Rome was not built in a day, as they say – it established data maturity across the organization by investing in capacity (new tools and technologies), developing competencies (new skills and processes), and finally spreading a culture (awareness, understanding, and best practices) of data and analytics. This analytical approach brought results – fewer surgical infections, more effective use of antibiotics, less time in intensive care, etc. – contributing to lower costs, better medical outcomes, and overall patient satisfaction.

Read more

Expand Your Big Data Capabilities With Unstructured Text Analytics

Boris Evelson
Beware of insights! Real danger lurks behind the promise of big data to bring more data to more people faster, better, and cheaper: Insights are only as good as how people interpret the information presented to them. When looking at a stock chart, you can't even answer the simplest question — "Is the latest stock price move good or bad for my portfolio?" — without understanding the context: where you are in your investment journey and whether you're looking to buy or sell. While structured data can provide some context — like checkboxes indicating your income range, investment experience, investment objectives, and risk tolerance levels — unstructured data sources contain several orders of magnitude more context. An email exchange with a financial advisor indicating your experience with a particular investment vehicle, news articles about the market segment heavily represented in your portfolio, and social media posts about companies in which you've invested or plan to invest can all generate much broader and deeper context to better inform your decision to buy or sell. 
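To ground the idea, here is a toy Python sketch using scikit-learn (my choice for illustration, not a tool the research prescribes) that imposes a little structure on a handful of invented snippets by extracting latent themes.

    # Toy text analytics: TF-IDF features plus NMF topic extraction.
    # The four "documents" are invented one-line stand-ins.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import NMF

    docs = [
        "advisor email about bond fund experience and risk tolerance",
        "news article on tech sector weighting in the portfolio",
        "social post praising a company we plan to invest in",
        "survey comment about contact center wait times",
    ]

    tfidf = TfidfVectorizer(stop_words="english")
    X = tfidf.fit_transform(docs)

    # Factor the documents into two latent "context" themes.
    nmf = NMF(n_components=2, random_state=0)
    nmf.fit(X)

    terms = tfidf.get_feature_names_out()
    for i, topic in enumerate(nmf.components_):
        top = [terms[j] for j in topic.argsort()[-4:][::-1]]
        print(f"theme {i}: {', '.join(top)}")

Real customer text is, of course, far messier than four tidy sentences, which is exactly the gap the next paragraph describes.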
 
But defining the context by finding structures, patterns, and meaning in unstructured data is not a simple process. As a result, firms face a gap between data and insights; while they are awash in an abundance of customer and marketing data, they struggle to convert this data into the insights needed to win, serve, and retain customers. In general, Forrester has found that: 
 
  • The problem is not a lack of data. Most companies have access to plenty of customer feedback surveys, contact center records, mobile tracking data, loyalty program activities, and social media feeds — but, alas, this data is not easily available to the business leaders who need it to make decisions. 
Read more

Need Better Customer Insights To Fuel Your Digital Strategy? Start By Working On Your Communication Skills

Michael Barnes

 

Retaining and delighting empowered customers requires continuous, technology-enabled innovation and improved customer insight (CI). The logic is simple in theory, but that doesn’t make it any easier to implement in practice.

In my recent report, “Applying Customer Insight To Your Digital Strategy,” I highlight the top lessons learned from organizations in Asia Pacific (AP) that are successfully leveraging CI to fuel digital initiatives. It all starts with ensuring that data-driven decision-making is central to the digital strategy. With that in mind, I want to use this blog post to focus on two key lessons from the report:

 

Lesson One: Establish A Clear Mandate To Invest In Customer Analytics

Successful companies serve empowered customers in the way they want to be served, not the way the company wants to serve them. When building a mandate, you should:

■  Expect natural tensions between various business stakeholders to arise. To secure buy-in from senior business decision-makers, start by illustrating the clear link between digital capabilities and data as a source of improved customer understanding. Identify measurable objectives and then link them to three to four scenarios that highlight where the biggest opportunities and risks exist. Continue to justify data-related investments by restating these scenarios at regular intervals.

Read more

Intel Announces Xeon SOC – Seriously Raising the Bar for AMD and ARM Competition

Richard Fichera

Intel has made no secret of its development of the Xeon D, an SOC product designed to take Xeon processing close to the power levels and product niches currently occupied by its lower-power, lower-performance Atom line, where emerging competition from ARM is more viable.

The new Xeon D-1500 is clear evidence that Intel “gets it” when it comes to platforms for hyperscale computing and other throughput-per-watt- and density-sensitive workloads, both in the enterprise and in the cloud. The D-1500 breaks new ground in several areas:

It is the first Xeon SOC, combining 4 or 8 Xeon cores with embedded I/O, including SATA, PCIe, and multiple 10 Gb and 1 Gb Ethernet ports.


It is the first of Intel’s 14 nm server chips expected to be introduced this year. This expected process shrink will also deliver further gains in performance and performance per watt across Intel’s entire line of entry-level through midrange server parts this year.

Why is this significant?

With the D-1500, Intel effectively draws a very deep line in the sand for emerging ARM technology as well as for AMD. The D-1500, at 20W – 45W, delivers the lower end of Xeon performance at power and density levels previously associated with Atom, and close enough to what is expected from the newer generation of higher-performance ARM chips to once again call into question the viability of ARM on pure performance and efficiency grounds. While ARM implementations with embedded accelerators such as DSPs may still be attractive for selected workloads, the availability of a mainstream x86 option at these power levels may blunt the pace of ARM design wins both for general-purpose servers and for embedded designs, notably storage systems.

Read more