There is always a tendency to regard the major players in large markets as a static background against which the froth of smaller companies and the rapid dance of customer innovation plays out. But if we turn our lens toward the major server vendors (who are now storage, networking, and software vendors as well), we see that relatively flat industry revenues hide almost continuous churn. Turn back the clock slightly more than five years, and the market was dominated by three vendors: HP, Dell and IBM. Since then, IBM has divested itself of the highest-velocity portion of its server business, Dell is no longer a public company, Lenovo is now a major player in servers, Cisco has come out of nowhere to mount a serious challenge in the x86 server segment, and HP has announced that it intends to split itself into two companies.
And it hasn’t stopped. Two recent events, the fracturing of the VCE consortium and the formerly unthinkable hook-up of IBM and Cisco, illustrate the urgency with which existing players are seeking differential advantage, and reinforce our contention that converged and integrated infrastructure remains one of the most active and profitable segments of the industry.
EMC’s recent acquisition of Cisco’s interest in VCE effectively acknowledged what most customers have been telling us for a long time – that VCE had become essentially an EMC-driven sales vehicle for storage, supported by VMware (owned by EMC) as software and Cisco as a systems platform. EMC’s purchase of Cisco’s interest also tacitly acknowledges two underlying tensions in the converged infrastructure space:
By now you have at least seen the cute little elephant logo, or you may have spent serious time with the basic components of Hadoop: HDFS, MapReduce, Hive, Pig, and, most recently, YARN. But do you have a handle on Kafka, Rhino, Sentry, Impala, Oozie, Spark, Storm, Tez… Giraph? Do you need a ZooKeeper? Apache has one of those too! The latest version of the Hortonworks Data Platform, for example, bundles more than 20 Apache packages and reflects the chaos of the open source ecosystem. Cloudera, MapR, Pivotal, Microsoft, and IBM all have their own products and open source additions while supporting various combinations of the Apache projects.
After hearing the confusion between Spark and Hadoop one too many times, I was inspired to write a report, The Hadoop Ecosystem Overview, Q4 2014. For those whose day jobs don’t include constantly tracking Hadoop’s evolution, I dove in and worked with Hadoop vendors and trusted consultants to create a framework. We divided the complex Hadoop ecosystem into a core set of tools that all work closely with data stored in the Hadoop Distributed File System (HDFS) and an extended group of components that leverage Hadoop but do not require it.
In the past, enterprise architects could afford to think big picture, which meant treating Hadoop as a single package of tools. Not anymore: you need to understand the details to keep up in the age of the customer. Use our framework to help, but please read the report if you can, as it includes much more detail.
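To make the two-bucket framework concrete, here is a minimal sketch in Python. The component names come from this post; the particular core/extended groupings shown are my own illustrative reading, not the report's official taxonomy.

```python
# Illustrative sketch of a core/extended split for Hadoop ecosystem components.
# The grouping below is a hypothetical example, not the report's actual framework.
HADOOP_FRAMEWORK = {
    # Core: tools that work closely with data stored in HDFS.
    "core": {"HDFS", "MapReduce", "YARN", "Hive", "Pig"},
    # Extended: components that can leverage Hadoop but do not require it.
    "extended": {"Kafka", "Spark", "Storm", "ZooKeeper"},
}

def classify(component: str) -> str:
    """Return 'core', 'extended', or 'unlisted' for a component name."""
    for bucket, members in HADOOP_FRAMEWORK.items():
        if component in members:
            return bucket
    return "unlisted"
```

The point of the framework is exactly this kind of quick triage: given an unfamiliar project name, decide whether it belongs to the HDFS-centric core or to the looser constellation of components around it.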
Digital transformation will drive technology spending growth of 4.9%. Always-connected, technology-empowered customers are redefining the sources of competitive advantage for AP organizations. In fact, 79% of business and technology decision-makers that Forrester surveyed indicated that improving the experience of technology-empowered customers will be a high or critical priority for their business in 2015. Similarly, 57% said that meeting consumers’ rising expectations was one of the reasons they would spend more money on technology next year, the top reported reason for increased technology spending.
An inquiry call from a digital strategy agency advising a client on data commercialization generated a lively discussion on strategies for taking data to market. With few best practices out there, the emerging opportunity just might feel like space exploration: boldly going where no one has gone before. The question is increasingly common: "We know we have data that would be of use to others, but how do we know? And which use cases should we pursue?" In It's Time To Take Your Data To Market, published earlier this fall, my colleagues and I provided some guidance on identifying and commercializing that "Picasso in the attic." But the ideas around how to go to market continue to evolve.
In answer to the inquiry questions asked the other day, my advice was pretty simple: don’t try to anticipate all possible uses of the data. Get started by making selected data sets available for people to play with, see what they can do, and talk about it to spread the word. There are, however, some specific use cases that can kick-start the process.
Look to your existing customers.
The grass is not always greener elsewhere, and your existing clients might just provide some fertile ground. A couple of thoughts on ways your existing customers could use new data sources:
On October 31, IBM and Tencent announced that they will work together to extend Tencent’s public cloud platform to the enterprise by building and marketing an industry-oriented public cloud.
Don’t be fooled into looking at this move in isolation. With this partnership, IBM is turning a new page in its transformation in China, responding to the challenges of a stricter regulatory environment, an increasingly consumerized technology landscape, and newly empowered customers. The move is a crucial milestone in IBM’s strategy to localize its vision for cloud, analytics, mobile, and social (CAMS). IBM has had a strategic focus on CAMS solutions and is systematically building an ecosystem on four pillars:
Cloud and social. This is where IBM and Tencent are a perfect match. IBM’s cloud managed service, operated by its partner 21Vianet, officially went live on September 23. It can support mission-critical applications like ERP and CRM solutions from SAP and Oracle from both the IaaS and SaaS perspectives. This could help Tencent target large enterprise customers beyond its traditional base of small and medium-size businesses (SMBs) and startups by adding social value to ERP, CRM, and EAM applications.
Digitally empowered customers — both businesses and consumers — wield a huge influence on enterprise strategies, policies, and customer-facing and internal processes. With mobile devices, the Internet, and all-but-unlimited access to information about products, services, prices, and deals, customers are now well informed about companies and their products, and are able to quickly find alternatives and use peer pressure to drive change. But not all organizations have readily embraced this paradigm shift; some cling desperately to rigid policies and inflexible business processes. A common thread running through most of the companies that are not succeeding in this new day and age is an inability to manage change successfully. Business agility — reacting to fast-changing business needs — is what enables businesses to thrive amid ever-accelerating market changes and dynamics.
There just might be another 800-lb gorilla in the Business Intelligence market. In a year.
The popular cult book “Hitchhiker's Guide To The Galaxy” by Douglas Adams defines space as “. . . big. Really big. You just won't believe how vastly, hugely, mind-bogglingly big it is. . .” There are no better words to describe the size and the opportunity of the business intelligence market. Not only is it “mind-bogglingly big,” but over the last few decades we’ve only scratched the surface. Recent Forrester research shows that only 12% of global enterprise business and technology decision-makers are sure of their ability to transform and use information for better insights and decision making, and over half still have BI and analytics content sitting in siloed desktop-based shadow IT applications that are mostly based on spreadsheets.
The opportunity has provided fertile feeding ground to more than fifty vendors, including: full-stack software vendors like IBM, Microsoft, Oracle, and SAP, each with $1 billion-plus BI portfolios; SAS Institute, a multibillion-dollar BI and analytics specialist; popular BI vendors Actuate, Information Builders, MicroStrategy, Qlik, Tableau Software, and Tibco Software, each with hundreds of millions in BI revenues; as well as dozens of vendors ranging from early- to late-stage startups.
One of the developing trends in computing, relevant to enterprises and service providers alike, is the notion of workload-specific or application-centric computing architectures. These architectures, optimized for specific workloads, promise improved efficiency for their targeted workloads and, by extension, the services they support. Earlier this year we covered the basics of this concept in “Optimize Scalable Workload-Specific Infrastructure for Customer Experiences.” This week, HP announced a pair of server cartridges for its Moonshot system that exemplify the concept; they are also representative of the next wave of ARM products that will emerge through the remainder of 2014 and into 2015 to tilt once more at the x86 windmill that currently dominates the computing landscape.
Specifically, HP has announced the ProLiant m400 Server Cartridge (m400) and the ProLiant m800 Server Cartridge (m800), both ARM-based servers packaged as cartridges for the HP Moonshot system, which can hold up to 45 of them in its approximately 4U enclosure. These servers are interesting from two perspectives: both are ARM-based products, one of them the first tier-1 vendor offering built on a 64-bit ARM CPU, and both are being introduced with a specific workload target in mind, for which they have been optimized.
I’ve recently been thinking a lot about application-specific workloads and architectures (Optimize Scalable Workload-Specific Infrastructure for Customer Experiences), which got me thinking about the extremes of the x86 server spectrum: the very small and the very large. The range and the variation in intended workloads are pretty spectacular as we diverge from the mean, which for the enterprise means a 2-socket Xeon server, usually in a 1U or 2U form factor.
At the bottom, we find really tiny embedded servers, some with very nontraditional packaging. My favorite is probably the technology from Arnouse Digital Technology, a small boutique that produces computers primarily for ruggedized military and industrial environments.
Slightly bigger than a credit card, its BioDigital server is a rugged embedded server with up to 8 GB of RAM, a 128 GB SSD, and a very low power footprint. Based on an Atom-class CPU, this is clearly not the choice for most workloads, but it is an exemplar of what happens when the workload lives in a hostile environment and the computer may need to be part of a man-carried or vehicle-mounted portable tactical or field system. While its creators are testing the waters for acceptance as a compute cluster with up to 4,000 units mounted in a standard rack, these will likely remain a niche product for applications requiring the intersection of small size, extreme ruggedness, and complete x86 compatibility, a range that runs from military systems to portable desktop modules.