Microsoft officially launched Cortana Suites — a key part of Windows Azure Intelligent Cloud — in China last week, together with MySQL Database on Azure. Windows Azure Intelligent Cloud provides real-time analytics and open source database services to Chinese customers in nationwide data centers operated by 21Vianet.
To give Chinese customers a better idea of how to use cloud-based analytics, Windows Azure demonstrated customer usage scenarios involving big data analytics in the cloud. The China Meteorological Administration partnered with AccuWeather, using Windows Azure to monitor and analyze air quality data from meteorological satellites and local air monitoring stations in real time.
Chinese manufacturers face challenges from digital service providers that better understand customers and shorten the distance from product design to the end user. After implementing real-time analytics on sensor data and customer behavior, manufacturers can improve their business models via:
Product innovation. Chinese manufacturers have started tracking operational data from sensors embedded in their products to manage and predict product upgrade and maintenance cycles. Customers increasingly prefer to pay only for the time they actually use the equipment, so mechanical manufacturers use cloud analytics to support this pay-per-use sales model. The recent rash of elevator accidents in China primarily involved elevators whose manufacturers lacked the labor resources for post-sales service, a constraint that Chinese elevator manufacturers commonly cite.
Ah, the good old days. The world used to be simple. ETL vendors provided data integration functionality, DBMS vendors provided data warehouse platforms, and BI vendors concentrated on reporting, analysis, and data visualization. And they all lived happily ever after, never stepping on each other's toes and benefiting from lucrative partnerships. Alas, the modern world of BI and data integration is infinitely more complex, with multiple, often overlapping offerings from data integration and BI vendors. I see the following three major segments in the market for preparing data for BI:
Fully functional and highly scalable ETL platforms that are used for integrating analytical data as well as for moving, synchronizing, and replicating operational, transactional data. This is still the realm of tech professionals, who use ETL products from Informatica, Ab Initio, IBM, Oracle, Microsoft, and others.
An emerging market of data preparation technologies that specialize mostly in integrating data for BI use cases and are run mostly by business users. Notable vendors in the space include Alteryx, Paxata, Trifacta, Datawatch, Birst, and a few others.
Data preparation features built right into BI platforms. Most leading BI vendors today provide such capabilities to a varying degree.
You’ve heard it before but we said it again – this time in our recent webinar. There's a new kid in town: the chief data officer. Why the new role? Because of an increasing awareness of the value of data and the painful recognition of an inability to take advantage of the opportunities that it provides — due to technology, business, or basic cultural barriers. That was the topic of our webinar presented to a full house a few days ago; we discussed our recent report, Top Performers Appoint Chief Data Officers. Fortunately for those who weren’t there, the presentation – Chief Data Officers Cross The Chasm – is available (to clients) for download.
As the title suggests, chief data officers are no longer just for the early adopters – those enthusiasts and visionaries on the forefront of new technology trends. With 45% of global companies having appointed a chief data officer (not to be confused with a chief digital officer, as we specifically asked about "data") and another 16% planning to make an appointment in the next 12 months, according to Forrester's Business Technographics surveys, the role of the chief data officer really has moved into the mainstream.
However, many companies remain unsure whether they need a CDO. Many of those in our audience fell into that category. We asked the audience two questions to gauge their interest and their actions to improve their data maturity:
Are you making organizational changes specifically to improve your data capabilities?
In chaos theory, the butterfly effect posits that seemingly small changes at one moment in time can result in large, dramatic changes at another. The subtle flap of a butterfly’s wing can trigger a violent hurricane that occurs miles away or days later. Rationally, the idea may seem like a stretch, but in a digital sense, we are witnesses to – and victims of – the butterfly effect every day through social media. A few individuals’ posts online can escalate into a chorus of voices that mobilizes communities and creates new standards. We saw this last year after a homeless man in Boston turned in a backpack and, more recently, when Cecil the lion was killed in Zimbabwe.
Social media has always been a catalyst for bringing people together as well as an outlet where consumers can vent. But when a surge of voices results in change, social media posts become more than ephemeral cybertext. And, according to Forrester's Consumer Technographics® data, consumers around the world leverage social media to generate buzz about current events, although consumers in some countries are more vocal than others:
In the past three decades, management information systems, data integration, data warehouses (DWs), BI, and other relevant technologies and processes only scratched the surface of turning data into useful information and actionable insights:
Organizations leverage less than half of their structured data for insights. The latest Forrester data and analytics survey finds that organizations use on average only 40% of their structured data for strategic decision-making.
Unstructured data remains largely untapped. Organizations are even less mature in their use of unstructured data. They tap only about a third of their unstructured data sources (28% of semistructured and 31% of unstructured) for strategic decision-making. And these percentages don’t include more recent components of a 360-degree view of the customer, such as voice of the customer (VoC), social media, and the Internet of Things.
BI architectures continue to become more complex. The intricacies of earlier-generation and many current business intelligence (BI) architectural stacks, which usually require the integration of dozens of components from different vendors, are just one reason it takes so long and costs so much to deliver a single version of the truth with a seamlessly integrated, centralized enterprise BI environment.
Existing BI architectures are not flexible enough. Most organizations take too long to get to the ultimate goal of a centralized BI environment, and by the time they think they are done, there are new data sources, new regulations, and new customer needs, which all require more changes to the BI environment.
Hello from the newest analyst serving Forrester Research’s CIO role. My name is Paul Miller, and I joined Forrester at the beginning of August. I am attached to Forrester’s London office, but it’s already clear that I’ll be working with clients across many time zones.
As my Analyst bio describes, my primary focus is on cloud computing, with a particular interest in the way that cloud-based approaches enable (or even require) organizations to embrace digital transformation of themselves and their customer relationships. Before joining Forrester, I spent six years as an independent analyst and consultant. My work spanned cloud computing and big data and I am sure that this broader portfolio of interests will continue into my Forrester research, particularly where I can explore the demonstrable value that these approaches bring to those who embrace them.
I am still working on the best way to capture and explain my research coverage, talking with many of my new colleagues, and learning about potential synergies between what they already do and what I could or should be doing. I know that the first document to appear with my name on it will be a CIO-friendly look at OpenStack, as the genesis of this new Brief lies in a report that I had to write as part of Forrester’s recruitment process. I have a long (long, long) list of further reports I am keen to get started on, and these should begin to appear online as upcoming titles in the very near future. I shall also be blogging here, and look forward to using this as a way to get shorter thoughts and perspectives online relatively quickly. I’ve been regularly blogging for work since early 2004, although too many of the blogs I used to write for are now only preserved in the vaults of Brewster Kahle’s wonderful Internet Archive.
The explosion of data and fast-changing customer needs have led many companies to a realization: They must constantly improve their capabilities, competencies, and culture in order to turn data into business value. But how do Business Intelligence (BI) professionals know whether they must modernize their platforms or whether their main challenges are mostly about culture, people, and processes?
"Our BI environment is only used for reporting — we need big data for analytics."
"Our data warehouse takes very long to build and update — we were told we can replace it with Hadoop."
These are just some of the conversations that Forrester clients initiate, believing they require a big data solution. But after a few probing questions, companies often realize that they may instead need to upgrade an outdated BI platform, switch to a different database architecture, add nodes to their data warehouse (DW) servers, improve their data quality and data governance processes, or apply other commonsense solutions to their challenges; new big data technologies may be one option, but not the only one, and sometimes not the best. Rather than incorrectly assuming that big data is the panacea for all issues associated with poorly architected and deployed BI environments, BI pros should follow the guidelines in Forrester's recent report to decide whether their BI environment needs a healthy dose of upgrades and process improvements or whether it requires different big data technologies. Here are some of the findings and recommendations from the full research report:
Even though Business Intelligence applications have been out there for decades, lots of people still struggle with how to get started with BI. I constantly deal with clients who mistakenly start their BI journey by selecting a BI platform first, or without thinking about the data architecture. I know it's a HUGE oversimplification, but in a nutshell, here's a simple roadmap (for a more complete roadmap, please see the Roadmap document in the Forrester BI Playbook) that will ensure your BI strategy is aligned with your business strategy and that you hit the ground running. The best way to start, IMHO, is from the performance management point of view:
Catalog your organization's business units and departments.
For each business unit/department, ask questions about its business strategy and objectives.
Then ask what goals they set for themselves in order to achieve those objectives.
Next, ask what metrics and indicators they use to track where they are against their goals and objectives. A good rule of thumb: no business area or department needs to track more than 20 to 30 metrics; more than that is unmanageable.
Then ask how they would like to slice and dice these metrics (by time period, region, business unit, customer segment, etc.).
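The interview steps above naturally produce a metrics catalog. A minimal sketch of one in Python follows; the class names, the example department, and its metrics are all hypothetical illustrations, not part of any Forrester methodology, and the 30-metric cap simply encodes the rule of thumb mentioned above.

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    """One KPI a department tracks against a goal."""
    name: str
    goal: str                                             # the goal this metric supports
    dimensions: list[str] = field(default_factory=list)   # slice/dice axes

@dataclass
class Department:
    name: str
    objectives: list[str]
    metrics: list[Metric] = field(default_factory=list)

    def add_metric(self, metric: Metric) -> None:
        # Rule of thumb from the roadmap: 20 to 30 metrics per
        # department is the practical ceiling; reject anything beyond.
        if len(self.metrics) >= 30:
            raise ValueError(f"{self.name}: more than 30 metrics is unmanageable")
        self.metrics.append(metric)

# Hypothetical catalog entry built from an interview with a sales team.
sales = Department(
    name="Sales",
    objectives=["Grow revenue 10% year over year"],
)
sales.add_metric(Metric(
    name="Quarterly bookings",
    goal="Grow revenue 10% year over year",
    dimensions=["time period", "region", "customer segment"],
))
```

Walking each department through this structure makes gaps obvious: a metric with no goal behind it, or a goal with no metric tracking it, both signal misalignment between the BI backlog and the business strategy.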
Business intelligence has gone through multiple iterations in the past few decades. While BI's evolution has addressed some of the technology and process shortcomings of the earlier management information systems, BI teams still face challenges. Enterprises are transforming only 40% of their structured data and 31% of their unstructured data into information and insights. In addition, 63% of organizations still use spreadsheet-based applications for more than half of their decisions. Many earlier and current enterprise BI deployments:
Have hit the limits of scalability.
Struggle to address rapid changes in customer and regulatory requirements.
Fail to break through waterfall's design limitations.
Suffer from mismatched business and technology priorities and languages.
When I think about data, I can't help but think about hockey. As a passionate hockey mom, it's hard to separate my conversations about data all week with clients from the practices and games I sit through, screaming encouragement to my son and his team (sometimes to the embarrassment of my husband!). So when I recently saw a documentary on the building of the Russian hockey team that our miracle US hockey team beat at the 1980 Olympics, the story of Anatoli Tarasov stuck with me.
Before the 1960s, Russia didn't have a hockey team. Then the Communist party determined that it was critical that Russia build one and compete on the world stage. They selected Anatoli Tarasov to build and coach the team. He couldn't see films of hockey. He couldn't watch teams play. There was no reference on how to play the game. And yet, he built a world-class hockey club that not only beat the great Nordic teams but went on to crush the Canadian teams that were the standard for hockey excellence.
This is a lesson for us all when it comes to data. Do we stick with our standards and recipes from Inmon and Kimball? Do we follow check-box assessments from CMMI, DM-BOK, or TOGAF's information architecture framework? Do we rely on governance compliance to police our data?
Or do we break the rules and create our own that are based on outcomes and results? This might be the scarier path. This might be the riskier path. But do you want data to be where your business needs it, or do you want to predefine, constrain, and bias the insight?