Open Data And Trust Play An Important Role In Emerging Digital Ecosystems

Dan Bieler

Open data is critical for delivering contextual value to customers in digital ecosystems. For instance, The Weather Channel and OpenWeatherMap collect weather-related data points from millions of data sources, including the wingtips of aircraft. They could share these data points with car insurance companies, allowing the insurers to expand their customer journey activities, such as alerting customers in real time to an approaching hailstorm so that car owners have a chance to move their cars to safety. Success requires making logical connections between isolated data fields to generate meaningful business intelligence.
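
To make the idea of connecting isolated data fields concrete, here is a minimal Python sketch that joins a hypothetical weather-alert feed with an insurer's policyholder records to decide who to warn. All field names and thresholds are illustrative assumptions, not any vendor's actual schema.

```python
# A minimal sketch of linking isolated data fields: joining hypothetical
# hail-alert records from a weather feed with an insurer's policyholder
# locations to decide who should receive a real-time warning.
# All field names (zip_code, policy_id, phone, severity) are illustrative.
import pandas as pd

hail_alerts = pd.DataFrame([
    {"zip_code": "80301", "severity": "severe", "eta_minutes": 20},
    {"zip_code": "80014", "severity": "moderate", "eta_minutes": 45},
])

policyholders = pd.DataFrame([
    {"policy_id": "P-1001", "zip_code": "80301", "phone": "+1-555-0100"},
    {"policy_id": "P-1002", "zip_code": "94105", "phone": "+1-555-0101"},
])

# Join the two isolated data sets on location, then keep only severe alerts
# that still leave enough lead time for the car owner to act.
at_risk = policyholders.merge(hail_alerts, on="zip_code")
to_notify = at_risk[(at_risk["severity"] == "severe") & (at_risk["eta_minutes"] >= 10)]

for row in to_notify.itertuples():
    print(f"Alert {row.policy_id} at {row.phone}: hail expected in {row.eta_minutes} minutes")
```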

Trust is also critical to delivering value in digital ecosystems. One of the key questions for big data is who owns the data: the division that collects it, the business as a whole, or the customer whose data is collected? Forrester believes that for data analytics to unfold its true potential and gain end user acceptance, users themselves must remain the ultimate owners of their own data.

The development of control mechanisms that allow end users to control their data is a major task for CIOs. One possible approach is dashboard portals that let end users specify which businesses can use which data sets and for what purpose. Private.me is trying to develop such a mechanism: it distributes an individual's information across servers run by nonprofit organizations. Data anonymization is another approach that many businesses are working on, although anonymization has limits as a means of ensuring true privacy.
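
As a rough illustration of what such a control mechanism could store, here is a minimal sketch of a consent registry in which a user grants or revokes a specific business's use of a specific data set for a specific purpose. This is a hypothetical model, not Private.me's actual design.

```python
# A minimal sketch (not Private.me's actual design) of the kind of consent
# record a user-facing data dashboard could store: which business may use
# which data set, and for what purpose. All names are illustrative.
from dataclasses import dataclass


@dataclass(frozen=True)
class ConsentGrant:
    user_id: str
    business: str
    data_set: str   # e.g. "vehicle_location"
    purpose: str    # e.g. "hail_alerts"


class ConsentRegistry:
    def __init__(self):
        self._grants = set()  # set of ConsentGrant records

    def grant(self, user_id, business, data_set, purpose):
        self._grants.add(ConsentGrant(user_id, business, data_set, purpose))

    def revoke(self, user_id, business, data_set, purpose):
        self._grants.discard(ConsentGrant(user_id, business, data_set, purpose))

    def is_allowed(self, user_id, business, data_set, purpose):
        # A business may only use the data if the user holds an explicit,
        # still-active grant for exactly this data set and purpose.
        return ConsentGrant(user_id, business, data_set, purpose) in self._grants


registry = ConsentRegistry()
registry.grant("user-42", "acme-insurance", "vehicle_location", "hail_alerts")
print(registry.is_allowed("user-42", "acme-insurance", "vehicle_location", "hail_alerts"))  # True
print(registry.is_allowed("user-42", "acme-insurance", "vehicle_location", "marketing"))    # False
```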

Read more

Is Big Data Enough? (Ramping Up For Strata In San Jose)

Brian Hopkins

I’m ramping up to attend Strata in San Jose, February 18, 19, and 20. Here is some info to help everyone who wants to connect and share thoughts. I'm looking forward to great sessions and a lot of thought leadership.

I’ll be setting aside some time for 1:1 meetings (Booked Full)

[Updated on 2/17] - I have set up some blocks of time to meet with people at Strata. Please follow the link below to schedule with me on a first-come, first-served basis.

meetme.so/BHopkins_1on1_20Mins

[Update] - I booked out inside 2 hours...didn't expect that! I may open up my calendar for more meetings, but I need to get a better bead on the sessions I want to attend first. Try to catch me at breakfast; I'll tweet when I'm there.

I’ll be posting my thoughts and locations on Twitter

The best way to connect with me at Strata is to follow me on Twitter: @practicingea.

You can post @ me or DM me. I’ll be posting my location and you can drop by for ad hoc conversations as well.

I’m very interested in your point of view - from data-driven to insights-driven

I am concluding very quickly that “big data” as we have viewed it for the last five years is not enough. I see firms using words like “real-time,” “right-time,” or “fast data” to suggest the need is much bigger than big data: it's about connecting data to action in a continuous learning loop.
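
As a rough sketch of what connecting data to action in a continuous learning loop can mean in practice, the toy Python loop below ingests an event, acts on it with a current decision rule, observes the outcome, and adjusts the rule. The event source, action, and feedback are all hypothetical placeholders, not any specific firm's pipeline.

```python
# A toy "data to action in a continuous learning loop": ingest an event,
# score it with the current rule, act, observe the outcome, and feed the
# outcome back to adjust the next decision. Everything here is a placeholder.
import random

model_threshold = 0.5   # stand-in for a learned decision parameter

def ingest_event():
    # Placeholder for a streaming source (clickstream, sensor feed, etc.)
    return {"signal": random.random()}

def act(event):
    # Placeholder business action, e.g. send an offer or an alert
    return event["signal"] > model_threshold

def observe_outcome(acted):
    # Placeholder feedback, e.g. did the customer respond to the action?
    return acted and random.random() > 0.4

for _ in range(1000):
    event = ingest_event()
    acted = act(event)
    success = observe_outcome(acted)
    # Learning step: nudge the threshold based on the observed outcome
    if acted and not success:
        model_threshold = min(0.9, model_threshold + 0.01)
    elif acted and success:
        model_threshold = max(0.1, model_threshold - 0.01)

print(f"learned threshold after 1000 events: {model_threshold:.2f}")
```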

Read more

Build An Agile Business Intelligence (BI) Organization

Boris Evelson
The battle of trying to apply traditional waterfall software development life-cycle (SDLC) methodology and project management to BI has already been fought — and largely lost. These approaches and best practices, which apply to most other enterprise applications, work well in some cases, as with very well-defined and stable BI capabilities like tax or regulatory reporting. Mission-critical, enterprise-grade BI apps can also have a reasonably long shelf life of a year or more. But these best practices do not work for the majority (anecdotally, about three-quarters) of BI initiatives, where requirements change much faster than these traditional approaches can support; by the time a traditional BI application development team rolls out what it thought was a well-designed BI application, it's too late. As a result, BI pros need to move beyond earlier-generation BI support organizations to:
 
  • Focus on business outcomes, not just technologies. Earlier-generation BI programs lacked an "outcomes first" mentality. Those programs employed bottom-up approaches that focused on project management and technology first, leaving clients without the outcomes they needed to manage the business; in other words, they created an insights-to-action gap. BI pros should use a top-down approach that defines key performance indicators, metrics, and measures that support the business's goals and objectives. They must resist the temptation to address technology and data needs before the business requirements.
Read more

Rethinking Analytics Infrastructure

Richard Fichera

Last year I published a reasonably well-received research document on Hadoop infrastructure, “Building the Foundations for Customer Insight: Hadoop Infrastructure Architecture”. Now, less than a year later, it’s looking obsolete, not so much because it was wrong for traditional Hadoop (and yes, it does seem funny to use a word like “traditional” to describe a technology that is itself still rapidly evolving and has been in mainstream use for only a handful of years), but because the universe of analytics technology and tools has been evolving at light speed.

If your analytics are anchored by Hadoop and its underlying MapReduce processing, then the mainstream architecture described in the document, clusters of servers each with their own compute and storage, may still be appropriate. On the other hand, if, like many enterprises, you are adding analysis tools such as NoSQL databases, SQL on Hadoop (Impala, Stinger, Vertica), and particularly Spark, an in-memory analytics technology well suited to real-time and streaming data, you may need to reassess the supporting infrastructure so that it can continue to support Hadoop while also catering to the differing access patterns of these other tool sets. This need to rethink the underlying analytics plumbing was brought home by a recent demonstration by HP of a reference architecture for analytics, publicly referred to as the HP Big Data Reference Architecture.
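
To illustrate the access-pattern difference described above, here is a minimal PySpark sketch in which a working set is cached in cluster memory and queried repeatedly, the pattern that favors memory-rich nodes over the classic disk-bound MapReduce layout. The input path and column names are assumptions made for the example.

```python
# A minimal PySpark sketch of why Spark's access pattern differs from classic
# MapReduce: the working set is cached in memory and queried repeatedly rather
# than re-read from disk on every pass. Path and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("analytics-sketch").getOrCreate()

events = spark.read.json("hdfs:///data/events/")  # could also be local or object storage
events.cache()  # pin the working set in cluster memory

# Several iterative queries hit the cached data instead of disk,
# which is where memory-heavy rather than storage-heavy nodes pay off.
by_sensor = events.groupBy("sensor_id").agg(F.avg("value").alias("avg_value"))
recent = events.filter(F.col("event_time") > "2015-02-01")

by_sensor.show()
print(recent.count())

spark.stop()
```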

Read more

Time To Reset Your Knowledge Of Big Data Ecosystems In China

Charlie Dai

At the China Hadoop Summit 2015 in Beijing this past weekend, I talked with various big data players, including large consumers of big data such as China Unicom, Baidu.com, JD.com, and Ctrip.com; Hadoop platform solution providers Hortonworks, RedHadoop, BeagleData, and Transwarp; infrastructure software vendors like Sequotia.com; and Agile BI software vendors like Yonghong Tech.

The summit was well-attended — organizers planned for 1,000 attendees and double that number attended — and from the presentations and conversations it’s clear that big data ecosystems are making substantial progress. Here are some of my key takeaways:

  • Telcos are focusing on optimizing internal operations with big data. Take China Unicom, one of China’s three major telcos, for example. China Unicom has completed a comprehensive business scenario analysis of related data across each segment of its internal business operations, including business and operations support systems, Internet data centers, and networks (fixed, mobile, and broadband). It has built a Hadoop-based big data platform that processes trillions of mobile access records every day within the mobile network to provide practical guidelines and progress monitoring for the construction of base stations; a simplified sketch of this kind of per-station aggregation follows below.
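
Here is the simplified sketch referenced above: a toy aggregation that counts access records per base station and flags congested cells, standing in for the far larger Hadoop-based pipeline described. The record fields and threshold are illustrative, not China Unicom's actual schema.

```python
# A toy version of the per-base-station aggregation described above: count
# access records per station and flag congested cells that may need new
# base-station capacity. Fields and threshold are illustrative only.
from collections import Counter

access_records = [
    {"base_station": "BJ-0001", "subscriber": "u1"},
    {"base_station": "BJ-0001", "subscriber": "u2"},
    {"base_station": "BJ-0042", "subscriber": "u3"},
]

load_per_station = Counter(r["base_station"] for r in access_records)

CONGESTION_THRESHOLD = 2  # illustrative; a real system would use traffic models
congested = [station for station, count in load_per_station.items()
             if count >= CONGESTION_THRESHOLD]

print(congested)  # stations that may need additional capacity

```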
Read more

Get Ready For BI Change

Boris Evelson

To compete in today's global economy, businesses and governments need agility and the ability to adapt quickly to change. The same is true of internal efforts to roll out enterprise-grade business intelligence (BI) applications: BI change is ongoing, and often many things change concurrently. One element that too often takes a back seat is the impact of changes on the organization's people. Prosci, an independent research company focused on organizational change management (OCM), has developed benchmarks that propose five areas in which change management needs to do better. They all involve the people side of change: better engage the sponsor; begin organizational change management early in the change process; get employees engaged in change activities; secure sufficient personnel resources; and better communicate with employees. Because BI is not a single application — and often not even a single platform — we recommend adding a sixth area: visibility into BI usage and performance management of BI itself, aka BI on BI. Forrester recommends keeping these six areas top of mind as your organization prepares for any kind of change.

Some strategic business events, like mergers, are high-risk initiatives involving major changes over two or more years; others, such as restructurings, must be implemented in six months. In the case of BI, some changes might need to happen within a few weeks or even days. All changes will lead to either achieving or failing to achieve a business objective. There are seven major categories of business and organizational change:

  1. People acquisitions
  2. Technology acquisitions
  3. Business process changes
  4. New technology implementations
  5. Organizational transformations
  6. Leadership changes
Read more

Bare Metal Clouds – Performance and Isolation Drive Consideration

Richard Fichera

I’ve been talking to a number of users and providers of bare-metal cloud services, and the common threads among the high-profile use cases are both interesting individually and starting to connect some dots. These providers offer the ability to provision and use dedicated physical servers with very similar semantics to the common VM IaaS cloud: servers that can be instantiated at will in the cloud, provisioned with a variety of OS images, connected to storage, and used to run applications. The differentiation for customers is in the behavior of the resulting images:

  • Deterministic performance – Your workload is running on a dedicated resource, so there is no question of any “noisy neighbor” problem, or even of sharing resources with otherwise well-behaved neighbors.
  • Extreme low latency – Like it or not, VMs, even lightweight ones, impose some level of additional latency compared to bare-metal OS images. Where this latency is a factor, bare-metal clouds offer a differentiated alternative.
  • Raw performance – Under the right conditions, a single bare-metal server can process more work than a collection of VMs, even when their nominal aggregate performance is similar. Benchmarking is always tricky, but several of the bare metal cloud vendors can show some impressive comparative benchmarks to prospective customers.
Read more

How The CMO And CIO Will Determine The Future Of Business In 2015

Cliff Condon
Forrester has just published 45 sets of 2015 predictions for every role we write about, from customer insights to application development to security and risk. In my role as Chief Research Officer, one thing is now clear to me: the two roles that matter most for 2015 are the CIO and the CMO (see our infographic below) -- their relationship and joint strategy to boost the business will determine the future of any corporation.
 
CMOs historically focused narrowly on marketing and promotion. That’s not enough in the age of the customer. The CMO of 2015 must own the most important driver of business success -- the customer experience -- and represent the customer’s perspective in corporate strategy. Andy Childs at Paychex is a great example -- he owns not only traditional marketing but also strategic planning and M&A.
 
Read more

Shifting Sands – Changing Alliances Underscore the Dynamism of the Infrastructure Systems Market

Richard Fichera

There is always a tendency to regard the major players in large markets as a static background against which the froth of smaller companies and the rapid dance of customer innovation plays out. But if we turn our lens toward the major server vendors (who are now also storage, networking, and software vendors), we see that relatively flat industry revenues hide almost continuous churn. Turn back the clock slightly more than five years, and the market was dominated by three vendors: HP, Dell, and IBM. In slightly more than five years, IBM has divested itself of the highest-velocity portion of its server business, Dell is no longer a public company, Lenovo is now a major player in servers, Cisco has come out of nowhere to mount a serious challenge in the x86 server segment, and HP has announced that it intends to split itself into two companies.

And it hasn’t stopped. Two recent events, the fracturing of the VCE consortium and the formerly unthinkable hook-up of IBM and Cisco, illustrate the urgency with which existing players are seeking differential advantage, and they reinforce our contention that converged and integrated infrastructure remains one of the most active and profitable segments of the industry.

EMC’s recent acquisition of Cisco’s interest in VCE effectively acknowledged what most customers have been telling us for a long time – that VCE had become essentially an EMC-driven sales vehicle to sell storage, supported by VMware (owned by EMC) and Cisco as a systems platform. EMC’s purchase of Cisco’s interest also tacitly acknowledges two underlying tensions in the converged infrastructure space:

Read more

Elephants, Pigs, Rhinos and Giraphs; Oh My! – It's Time To Get A Handle On Hadoop

Brian Hopkins

By now you have at least seen the cute little elephant logo or you may have spent serious time with the basic components of Hadoop like HDFS, MapReduce, Hive, Pig and most recently YARN. But do you have a handle on Kafka, Rhino, Sentry, Impala, Oozie, Spark, Storm, Tez… Giraph? Do you need a Zookeeper? Apache has one of those too! For example, the latest version of Hortonworks Data Platform has over 20 Apache packages and reflects the chaos of the open source ecosystem. Cloudera, MapR, Pivotal, Microsoft and IBM all have their own products and open source additions while supporting various combinations of the Apache projects.

After hearing the confusion between Spark and Hadoop one too many times, I was inspired to write a report, The Hadoop Ecosystem Overview, Q4 2014. For those whose day jobs don’t include constantly tracking Hadoop's evolution, I dove in and worked with Hadoop vendors and trusted consultants to create a framework. We divided the complex Hadoop ecosystem into a core set of tools that all work closely with data stored in the Hadoop Distributed File System (HDFS) and an extended group of components that leverage but do not require it.
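
As a small illustration of the core-versus-extended distinction, the hypothetical PySpark sketch below runs the same job against data in HDFS (core-style, inside a Hadoop cluster) or against local storage (extended-style, with no Hadoop storage required). The paths and format are assumptions made for the example, not part of the report's framework.

```python
# A minimal sketch of the "leverages but does not require HDFS" idea: the same
# engine can read from HDFS when it sits inside a Hadoop cluster, or from
# local/object storage when it does not. Paths and format are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("ecosystem-sketch").getOrCreate()

# Core-style usage: data lives in HDFS alongside the rest of the Hadoop stack.
hdfs_df = spark.read.parquet("hdfs:///warehouse/clicks/")

# Extended-style usage: the same engine and code, no Hadoop storage required.
local_df = spark.read.parquet("file:///tmp/clicks/")

print(hdfs_df.count(), local_df.count())
spark.stop()
```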

In the past, enterprise architects could afford to think big picture, which meant treating Hadoop as a single package of tools. Not anymore: you need to understand the details to keep up in the age of the customer. Use our framework to help, but please read the report if you can, as it includes a lot more detail.

Read more