Get ready for AWS business intelligence (BI): it's real and it packs a punch!
Today’s BI market is like a perpetual motion machine: an unstoppable engine that never seems to run out of steam. Forrester currently tracks more than 50 BI vendors, and not a month goes by without a software vendor or startup with tangential BI capabilities trying to take advantage of the craze for BI, analytics, and big data. This month is no exception: on October 7, Amazon crashed the party by announcing QuickSight, a new BI and analytics data management platform. BI pros will need to pay close attention, because this new platform is inexpensive, highly scalable, and has the potential to disrupt the BI vendor landscape. QuickSight is based on AWS’s cloud infrastructure, so it shares AWS characteristics like elasticity, abstracted complexity, and a pay-per-use consumption model. Specifically, the new QuickSight platform provides:
New ways to get terabytes of data into AWS (a brief illustrative sketch follows this list)
Automatic enrichment of AWS metadata for more effective BI
An in-memory accelerator (SPICE) to speed up big data analytics
An industrial-grade data analysis and visualization platform (QuickSight), including mobile clients
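To give a concrete, heavily simplified sense of the first item: getting data into AWS typically starts with landing extract files in a service such as S3, which QuickSight can then use as a data source alongside the likes of Redshift and RDS. The sketch below uses the boto3 Python SDK; the bucket, key, and file names are hypothetical, and AWS credentials are assumed to be configured in the environment.

```python
# A minimal sketch of landing a data extract in Amazon S3 so that a BI
# service such as QuickSight can pick it up. Bucket, key, and file names
# are hypothetical; credentials come from the environment/AWS config.
import boto3

s3 = boto3.client("s3")

# Upload a local CSV extract to S3 (boto3 handles multipart transfer).
s3.upload_file(
    Filename="sales_extract.csv",                   # local file (hypothetical)
    Bucket="acme-bi-landing-zone",                  # target bucket (hypothetical)
    Key="extracts/2015/10/sales_extract.csv",       # object key (hypothetical)
)

print("Extract uploaded; it can now be registered as a BI data source.")
```

This is only the landing step; in practice you would still register the bucket or table as a data source in whatever BI tool sits on top.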
Consumers (and B2B customers) are more and more empowered by mobile devices and cloud-based, all but unlimited access to information about products, services, and prices. Customer stickiness is increasingly difficult to achieve as customers demand instant gratification for their ever-changing tastes and requirements. Switching product and service providers is now just a matter of tapping a few keys on a mobile phone. Forrester calls this the age of the customer, which elevates business and technology priorities to achieve:
Business agility. Business agility often equals the ability to adapt, react, and succeed in the midst of an unending fountain of customer-driven requirements. Agile organizations make decisions differently by embracing a new, more grass-roots-based management approach. Employees down in the trenches, in individual business units, are the ones in close touch with customer problems, market shifts, and process inefficiencies. These workers are often in the best position to understand challenges and opportunities and to make decisions to improve the business. It is only when responses to change come from these highly aware and empowered employees that enterprises become agile, competitive, and successful.
Ah, the good old days. The world used to be simple: ETL vendors provided data integration functionality, DBMS vendors provided data warehouse platforms, and BI vendors concentrated on reporting, analysis, and data visualization. And they all lived happily ever after, never stepping on each other's toes and benefiting from lucrative partnerships. Alas, the modern world of BI and data integration is infinitely more complex, with multiple, often overlapping offerings from data integration and BI vendors. I see three major segments in the market for preparing data for BI:
Fully functional and highly scalable ETL platforms that are used for integrating analytical data as well as for moving, synchronizing, and replicating operational, transactional data. This is still the realm of tech professionals who use ETL products from Informatica, Ab Initio, IBM, Oracle, Microsoft, and others (a toy sketch of the basic ETL pattern follows this list).
An emerging market of data preparation technologies that specialize mostly in integrating data for BI use cases and are mostly run by business users. Notable vendors in the space include Alteryx, Paxata, Trifacta, Datawatch, Birst, and a few others.
Data preparation features built right into BI platforms. Most leading BI vendors today provide such capabilities to a varying degree.
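To make the distinction between these segments concrete, here is a deliberately tiny extract-transform-load flow in Python with pandas. The file names, columns, and SQLite target are hypothetical; full ETL platforms layer scheduling, lineage, error handling, and scale on top of this basic pattern.

```python
# A toy extract-transform-load (ETL) flow: pull raw records, clean and
# aggregate them, and load the result into an analytical target.
# File names, columns, and the SQLite target are hypothetical.
import sqlite3

import pandas as pd

# Extract: read a raw operational export.
orders = pd.read_csv("raw_orders.csv", parse_dates=["order_date"])

# Transform: basic cleansing plus an aggregate suitable for reporting.
orders = orders.dropna(subset=["customer_id", "amount"])
monthly_revenue = (
    orders
    .assign(month=orders["order_date"].dt.to_period("M").astype(str))
    .groupby(["month", "region"], as_index=False)["amount"]
    .sum()
    .rename(columns={"amount": "revenue"})
)

# Load: write the conformed table into a small analytical store.
with sqlite3.connect("bi_mart.db") as conn:
    monthly_revenue.to_sql("monthly_revenue", conn, if_exists="replace", index=False)
```

The same extract-transform-load shape underlies all three segments; what differs is who runs it (tech pro versus business user) and how much tooling surrounds it.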
My colleague Henry Baltazar and I have been watching the development of new systems and storage technology for years now, and each of us has been trumpeting, in our own way, the future potential of new non-volatile memory (NVM) technology not only to provide a major leap beyond current flash-based storage but also to trigger a major transformation in how servers and storage are architected and deployed, and eventually in how software treats persistent versus nonpersistent storage.
All well and good, but until very recently we were limited to vague prognostications about which flavor of NVM would finally belly up to the bar for mass production and how the resultant systems could be architected. In the last 30 days, two major technology developments (Intel's further disclosure of its future joint-venture NVM technology, now known as 3D XPoint™ Technology, and Diablo Technologies' introduction of Memory1) have allowed us to sharpen the focus on the potential outcomes and routes to market for this next wave of infrastructure transformation.
Gene Leganza and I just published a report on the role of the Chief Data Officer that we’re hearing so much about these days – Top Performers Appoint Chief Data Officers. To introduce the report, we sat down with our press team at Forrester to talk about the findings, and the implications for our clients.
Forrester PR: There's a ton of fantastic data in the report around the CDO. If you had to call out the most surprising finding, what would top your list?
Gene: No question it's the high correlation between high-performing companies and those with CDOs. Jennifer and I both feel that strong data capabilities are critical for organizations today and that the data agenda is quite complex and in need of strong leadership. That all means that it's quite logical to expect a correlation between strong data leadership and company performance, but given the relative newness of the CDO role, it was surprising to see firm performance so closely linked to the role.
Of course, you can't infer cause and effect from correlation – the data could mean that execs in high-performing companies think having a CDO role is a good idea as much as it could mean CDOs are materially contributing to high performance. Either way, that single statistic should make one take a serious look at the role in organizations without clear data leadership.
In the past three decades, management information systems, data integration, data warehouses (DWs), BI, and other relevant technologies and processes have only scratched the surface of turning data into useful information and actionable insights:
Organizations leverage less than half of their structured data for insights. The latest Forrester data and analytics survey finds that organizations use on average only 40% of their structured data for strategic decision-making.
Unstructured data remains largely untapped. Organizations are even less mature in their use of unstructured data. They tap only about a third of their unstructured data sources (28% of semistructured and 31% of unstructured) for strategic decision-making. And these percentages don’t include more recent components of a 360-degree view of the customer, such as voice of the customer (VoC), social media, and the Internet of Things.
BI architectures continue to become more complex. The intricacies of earlier-generation and many current business intelligence (BI) architectural stacks, which usually require the integration of dozens of components from different vendors, are just one reason it takes so long and costs so much to deliver a single version of the truth with a seamlessly integrated, centralized enterprise BI environment.
Existing BI architectures are not flexible enough. Most organizations take too long to get to the ultimate goal of a centralized BI environment, and by the time they think they are done, there are new data sources, new regulations, and new customer needs, which all require more changes to the BI environment.
The explosion of data and fast-changing customer needs have led many companies to a realization: They must constantly improve their capabilities, competencies, and culture in order to turn data into business value. But how do business intelligence (BI) professionals know whether they must modernize their platforms or whether their main challenges are mostly about culture, people, and processes?
"Our BI environment is only used for reporting — we need big data for analytics."
"Our data warehouse takes very long to build and update — we were told we can replace it with Hadoop."
These are just some of the conversations that Forrester clients initiate, believing they require a big data solution. But after a few probing questions, companies realize that they may need to upgrade their outdated BI platform, switch to a different database architecture, add extra nodes to their data warehouse (DW) servers, improve their data quality and data governance processes, or apply other commonsense solutions to their challenges; new big data technologies may be one of the options, but not the only one, and sometimes not the best. Rather than incorrectly assuming that big data is the panacea for all issues associated with poorly architected and deployed BI environments, BI pros should follow the guidelines in the recent Forrester report to decide whether their BI environment needs a healthy dose of upgrades and process improvements or whether it requires different big data technologies. Here are some of the findings and recommendations from the full research report:
Even though business intelligence applications have been out there for decades, lots of people still struggle with “how do I get started with BI?” I constantly deal with clients who mistakenly start their BI journey by selecting a BI platform first or by not thinking about the data architecture at all. I know it’s a HUGE oversimplification, but in a nutshell, here’s a simple roadmap (for a more complete roadmap, please see the Roadmap document in the Forrester BI Playbook) that will ensure your BI strategy is aligned with your business strategy and you hit the ground running. The best way to start, IMHO, is from the performance management point of view (a minimal sketch of the resulting metric catalog follows the list):
Catalog your organization's business units and departments
For each business unit/department, ask questions about its business strategy and objectives
Then ask what goals they set for themselves in order to achieve those objectives
Next, ask what metrics and indicators they use to track where they are against their goals and objectives. A good rule of thumb: no business area or department needs to track more than 20 to 30 metrics; more than that is unmanageable.
Then ask how they would like to slice and dice these metrics (by time period, region, business unit, customer segment, etc.)
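As noted above, here is a minimal sketch of what the output of this exercise might look like: a small, hypothetical metric catalog in Python. The business unit, objective, goal, metrics, and dimensions are placeholders, and the only rule encoded is the 20-to-30-metric ceiling from the roadmap.

```python
# A minimal, hypothetical metric catalog capturing the roadmap questions:
# business unit -> objective -> goal -> metrics -> slice/dice dimensions.
from dataclasses import dataclass, field

MAX_METRICS = 30  # rule of thumb from the roadmap: 20 to 30 metrics per area


@dataclass
class MetricCatalogEntry:
    business_unit: str
    objective: str
    goal: str
    metrics: list = field(default_factory=list)      # names of KPIs/indicators
    dimensions: list = field(default_factory=list)   # how to slice/dice them

    def add_metric(self, name: str) -> None:
        if len(self.metrics) >= MAX_METRICS:
            raise ValueError(
                f"{self.business_unit}: more than {MAX_METRICS} metrics is unmanageable"
            )
        self.metrics.append(name)


# Hypothetical example entry for one department.
entry = MetricCatalogEntry(
    business_unit="Field Sales",
    objective="Grow revenue in existing accounts",
    goal="10% year-over-year growth in renewals",
    dimensions=["time period", "region", "business unit", "customer segment"],
)
entry.add_metric("renewal rate")
entry.add_metric("average deal size")
```

The point of the sketch is simply that the roadmap's answers form a small, structured catalog long before any BI platform is selected.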
Business intelligence has gone through multiple iterations in the past few decades. While BI's evolution has addressed some of the technology and process shortcomings of the earlier management information systems, BI teams still face challenges. Enterprises are transforming only 40% of their structured data and 31% of their unstructured data into information and insights. In addition, 63% of organizations still use spreadsheet-based applications for more than half of their decisions. Many earlier and current enterprise BI deployments:
Have hit the limits of scalability.
Struggle to address rapid changes in customer and regulatory requirements.
Fail to break through waterfall's design limitations.
Suffer from mismatched business and technology priorities and languages.
In scanning through my O’Reilly Data Newsletter today, I noticed A Healthy Dose of Data, an MIT Sloan case study on the data and analytics culture at Intermountain, a healthcare network that runs 22 hospitals and 185 clinics. The study is definitely worth the read. It reviews the history of data use at Intermountain, which began way before the “big data” craze of recent years. In fact, it was back in the 1950s that one of the Intermountain cardiologists, Homer Warner, began to explore clinical data to understand why some heart patients experienced better outcomes than others. He went on to become known as the “father of medical informatics” (the use of computer programs to analyze patient data to determine treatment protocols) and, with colleagues, designed and launched their first decision-support tool.
The case study goes on to describe how Intermountain has cultivated a strong data and analytics culture. Over time – Rome was not built in a day, as they say – they established data maturity across the organization by investing in the capacity (new tools and technologies), developing the competencies (new skills and processes), and finally spreading the culture (awareness, understanding, and best practices) of data and analytics. Their analytical approach brought results – fewer surgical infections, more effective use of antibiotics, less time in intensive care, etc. – contributing to lower costs, better medical outcomes, and overall patient satisfaction.