HP Expands Its x86 Options With Mission-Critical Program – Defense And Offense Combined

Today HP announced a new set of technology programs and future products designed to move x86 server technology, for both Windows and Linux, more fully into the realm of truly mission-critical computing. My interpretation is that this is a combined defensive and offensive move on HP's part: it protects HP as its Itanium/HP-UX portfolio slowly declines, and it offers attractive and potentially unique options to current and future customers who want to deploy increasingly critical services on x86 platforms.

What’s Coming?

Bearing in mind that the earliest of these elements will not be in place until approximately mid-2012, the key elements that HP is currently disclosing are:

ServiceGuard for Linux – This is a big win for Linux users on HP, and it removes a major operational and architectural hurdle for HP-UX migrations. ServiceGuard is a highly regarded clustering and high-availability (HA) facility on HP-UX, with many features for local and geographically distributed HA, and its absence is often cited as a risk in HP-UX migrations. Its availability on Linux by mid-2012 will remove yet another barrier to smooth migration from HP-UX to Linux and will help HP retain that business as it moves off HP-UX.

Analysis engine for x86 – The analysis engine is internal software that provides system diagnostics, predictive failure analysis, and self-repair on HP-UX systems. HP will port it to selected x86 servers, with a delivery date not yet committed. My guess is that, since the analysis engine probably requires some level of hardware assist, it will be paired with the next item on the list…

Read more

Another Reason Not To Cloud Wash - Real Cloud Services Are Maturing Fast

We know that enterprise infrastructure & operations (I&O) professionals are under tremendous executive pressure to get to yes on cloud computing, and that this can be an uncomfortable proposition. Understanding the security, maturity, and return on investment of cloud services can be challenging, and in many cases you might argue that you can provide the same capabilities from your own data center. But there's no denying that enterprises are increasing their consumption of these services and that their value proposition is unique and compelling - if not to I&O directly.

Since cloud became a household word, vendors and enterprises alike have jumped to declare victory on cloud with services and infrastructure implementations that share the same foundation but really don't deliver cloud value - something we call "cloudwashing." This is a dangerous gambit: you claim legitimacy, but you don't achieve the same economics, don't deliver the autonomy that cloud services offer your internal users, and aren't standardized or automated enough to deliver transformative agility. In other words, you claim cloud but achieve only incrementally better value.

Read more

Data Scientist: Is This Really Science Or Just Pretension?

Every true scientist must also be a type of data scientist, although not all self-proclaimed data scientists are in fact true scientists.

True science is nothing without observational data. Without a fine-grained ability to sift, sort, structure, categorize, analyze, and present data, the scientist can’t bring coherence to their inquiry into the factual substrate of reality. Just as critical, a scientist who hasn’t drilled down into the heart of their data can’t effectively present or defend their findings.

Fundamentally, science is a collaborative activity of building and testing interpretive frameworks through controlled observation. At the heart of any science are the “controls” that help you isolate the key explanatory factors from those with little or no impact on the dependent variables of greatest interest. All branches of science rely on logical controls, such as adhering to the core scientific methods of hypothesis, measurement, and verification, as vetted through community controls such as peer review, refereed journals, and the like. Some branches of science, such as chemistry, rely largely on experimental controls. Some, such as astronomy, rely on the controls embedded in powerful instrumentation like space telescopes. Still others, such as the social sciences, may use experimental methods but rely principally on field observation and on statistical methods for finding correlations in complex behavioral data.
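To make the notion of a statistical control concrete, here is a minimal Python sketch (the variables and simulated data are my own illustration, not drawn from any particular study). It shows how a raw correlation between two behavioral variables can be almost entirely an artifact of a shared confounder, and how residualizing both variables on that confounder, one simple form of statistical control, exposes the spurious relationship:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    z = rng.normal(size=n)                       # hidden confounder (say, age)
    x = 0.8 * z + rng.normal(scale=0.6, size=n)  # behavior partly driven by z
    y = 0.8 * z + rng.normal(scale=0.6, size=n)  # outcome partly driven by z

    def partial_corr(x, y, z):
        # Statistical control: remove the linear effect of z from x and y,
        # then correlate what is left over.
        rx = x - np.polyval(np.polyfit(z, x, 1), z)
        ry = y - np.polyval(np.polyfit(z, y, 1), z)
        return np.corrcoef(rx, ry)[0, 1]

    print(np.corrcoef(x, y)[0, 1])  # raw correlation: deceptively strong (~0.6)
    print(partial_corr(x, y, z))    # near zero once z is controlled for

The experimental and instrumental controls mentioned above serve the same purpose by other means: they hold the confounders constant rather than adjusting for them after the fact.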

Read more

Have You Considered BI for IT Service Management?

A few months ago, I blogged about the fact that, while we were getting “excited” about Cloud and Social in the context of IT service management (ITSM), we were somewhat neglecting the impact of Mobile on our ability to deliver high-quality IT services (Social? Cloud? What About Mobile?). At the time, with the title of the blog tantamount to IT buzzword bingo, I chuckled to myself that all I needed was to throw in a reference to Big Data and I could have called “house.”

What do we do with all the data imprisoned within our ITSM tools?

Big Data? No, not really, more BI

While the Big Data perspective will be seen as a little too "large" for ITSM tool data (the Wikipedia definition of Big Data describes it as "data sets whose size is beyond the ability of commonly used software tools to capture, manage, and process the data within a tolerable elapsed time"), I can't help thinking that these considerably smaller ITSM data sets are still ripe for the use of business intelligence (BI).

We have so much valuable data stored within our ITSM tools and, while we leverage existing reporting and analysis capabilities to identify trends and snapshots such as Top 10 problem areas, do we really mine the ITSM tool data to the best of our ability?

If we do (I can't say I've seen ITSM tool vendors make a song and dance about such capabilities), is it something that is both easy to implement and easy to use?
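To ground the question, here is a minimal sketch in Python with pandas of the kind of mining I have in mind; the file name and column names are hypothetical stand-ins for whatever your ITSM tool actually exports:

    import pandas as pd

    # Hypothetical incident export; file and column names are assumptions.
    incidents = pd.read_csv("incident_export.csv", parse_dates=["opened_at"])

    # The classic snapshot: top 10 problem areas by incident volume.
    print(incidents["category"].value_counts().head(10))

    # A step beyond the snapshot: monthly volume per category, to spot trends
    # (a rising category is a problem-management lead, not just a league table).
    trend = (incidents
             .groupby([incidents["opened_at"].dt.to_period("M"), "category"])
             .size()
             .unstack(fill_value=0))
    print(trend.tail(6))

Even this much, run regularly, starts to answer questions that the stock Top 10 report can't.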

Why am I bringing this up now? Are things changing?

Hopefully yes.

Read more

Dusting Off Our Content Security Crystal Ball

Winter is coming; the year is quickly drawing to a close, and it's time to take a look back and see how accurate our content security crystal ball was for 2011. Last year we predicted three trends; two predictions were accurate and one was partially correct. Let's take a closer look.

1)  Content security spending will slow down - We were right. According to our latest survey data, content security represented 6% of the total IT security budget, down one percentage point from 2010. Content security remains one of the lowest budgeted technology areas in IT.

2)  Consolidation will continue to drive suite offerings - We were partially correct. In 2011, we didn't see any significant M&A activity in the content security space, so we were wrong on vendor consolidation. We were right, however, that market leaders would increase their data loss prevention and mobile capabilities to further solidify their market positions.

3)  Mobile filtering will enter mainstream IT - We were correct. Laptop filtering is mainstream, and mobile device filtering is gaining momentum and getting significant attention. Content security vendors are currently testing content filtering on mobile phones and tablets.

What about 2012? To see the five trends we predict will impact your strategy next year, check out the full document: "Content Security: 2012 Budget And Planning Guide." Here's a teaser: is your content security strategy ready for the extended enterprise?

Fujitsu Forum 2011 In Munich: A Global Reset

At its Fujitsu Forum, which was attended by over 10,000 customers and partners, Fujitsu presented itself as a company in transformation from a fairly disjointed business to a more streamlined international one. Fujitsu's new strategy has three main components:

  • Focus on organic growth: Fujitsu is investing more in its sales and services structure as well as its internal IT systems. It aims to get better at what it has already been doing, such as exploiting its large software and hardware portfolio, including smartphones, thin clients, handsets, tablets, mainframes, laptops, and supercomputers. In terms of services, Fujitsu is pushing its multivendor maintenance capabilities and its IT outsourcing experience. Fujitsu considers its product knowledge and near- and offshore mix a key, unique selling point vis-à-vis its competitors. Given Fujitsu's weak marketing and sales structures of the past, we believe it is high time to improve its go-to-market approach.
  • Target emerging markets: The main focus is on Russia, India, and the Middle East. Fujitsu is ramping up local operations and adapting its go-to-market approaches; in India, for instance, it is running a promotional campaign via auto rickshaw on a "see-try-buy" basis. Fujitsu's goal is to double emerging-market sales by 2015 from €800 million in 2010. Given its Asian roots, it is astonishing how long it took Fujitsu to realize the opportunities at its doorstep.
Read more

Why Does Mobility Need To Be Prioritized In Your IT Planning?

There are four main business and market drivers pushing IT to put, and keep, mobility front and center in its 2012 planning.

Enterprise mobility will dominate IT priorities in 2012, and this trend will continue for at least the next three years. Some of the big drivers for prioritizing mobility that we've identified during 2011 include:

1)      Users are demanding improved mobility support: This includes supporting more personal mobile devices (smartphones and tablets), expanding use of mobile apps both inside and outside the office, and supporting new mobile operating systems, especially Android and Apple iOS in addition to BlackBerry.  

2)      The business is finding ways to deploy apps they want without you: IT needs a strategy for prioritizing mobile apps development and deployment. The business also needs updated guidance about who pays for smartphones and tablets, and the associated mobile services, endpoint security, and appropriate use of personal devices.

3)      Customers are voracious about multichannel access to your content: Mobility will be key in social computing initiatives to drive deeper customer engagement. Customers (and suppliers) will love you for giving them great mobile apps like a product catalog, maintenance schedules, or project calendars accessible using their Internet-connected mobile device (smartphone, tablet).

Read more

What Is ADV And Why Do We Need It?

As industry-renowned data visualization expert Edward Tufte once said, "The world is complex, dynamic, multidimensional; the paper is static, flat. How are we to represent the rich visual world of experience and measurement on mere flatland?" There's indeed just too much information out there to be effectively analyzed by all categories of knowledge workers. More often than not, traditional tabular row-and-column reports do not paint the whole picture or, even worse, can lead an analyst to a wrong conclusion. There are multiple reasons to use data visualization; the three main ones are that one:

  • Cannot see a pattern without data visualization. Simply seeing numbers on a grid often does not tell the whole story; in the worst case, it can even lead one to a wrong conclusion. This is best demonstrated by Anscombe's quartet, where four seemingly similar groups of x and y coordinates reveal very different patterns when represented in a graph (the short sketch after this list reproduces the quartet's statistics).
  • Cannot fit all of the necessary data points onto a single screen. Even with the smallest reasonably readable font, single line spacing, and no grid, one cannot realistically fit more than a few thousand data points using numerical information only. With advanced data visualization techniques, one can fit tens of thousands of data points onto a single screen, a difference of an order of magnitude. In The Visual Display of Quantitative Information, Edward Tufte gives an example of more than 21,000 data points effectively displayed on a US map that fits onto a single screen.
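For the curious, Anscombe's quartet is easy to reproduce. A minimal Python sketch using the published quartet values shows that all four data sets share essentially the same mean, correlation, and fitted regression line, which is precisely why only a plot reveals how different they are:

    import numpy as np

    # Anscombe's quartet: four data sets with near-identical summary statistics.
    x123 = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
    quartet = {
        "I":   (x123, [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]),
        "II":  (x123, [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]),
        "III": (x123, [7.46, 6.77, 12.74, 7.11, 7.81, 8.84, 6.08, 5.39, 8.15, 6.42, 5.73]),
        "IV":  ([8] * 7 + [19] + [8] * 3,
                [6.58, 5.76, 7.71, 8.84, 8.47, 7.04, 5.25, 12.50, 5.56, 7.91, 6.89]),
    }

    for name, (x, y) in quartet.items():
        x, y = np.asarray(x, float), np.asarray(y, float)
        slope, intercept = np.polyfit(x, y, 1)
        r = np.corrcoef(x, y)[0, 1]
        # Each set prints roughly: mean_y 7.50, r 0.816, fit y = 3.00 + 0.50x.
        print(f"{name:>3}: mean_y={y.mean():.2f}  r={r:.3f}  fit y={intercept:.2f}+{slope:.2f}x")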
Read more

Meeting The Challenges Of The Empowered Consumer

Retail is experiencing substantial change because consumers are now empowered by the web with information about price, availability, and merchandise features.

The retail industry is still served by solutions that are too fragmented to adequately balance the asymmetry introduced by radical price transparency. There are solutions for transactions, websites, stores, and so on, but little to empower the cross-channel retailer to really meet the consumer's needs.

I’ve recently been looking at IBM’s Smarter Commerce initiative and its portfolio that integrates:

1)    Store applications. IBM has well-established store apps appropriate to high-volume, low-touch retailing but correctly identifies these as inappropriate for fast-growing specialty retail with low-volume "high touch." This is why it acquired the "asset" of Open Genius.

2)    Web metrics. IBM acquired Coremetrics in order to bring the discipline of measuring traffic, conversion, and average order value to cross-channel retailing. It's only by monitoring such metrics that retail can understand which marketing strategies are really successful and which market segments are most receptive (see the short sketch after this list).

3)    Direct-to-consumer initiatives. IBM acquired Unica as a platform for integrating automated direct-to-consumer marketing with its cross-channel offering.
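As an aside on the web metrics point, the core numbers themselves are simple arithmetic; what matters is computing them consistently per channel. Here is a minimal, hypothetical Python sketch (the figures and field names are my own illustration, not Coremetrics' data model):

    # Hypothetical per-channel funnel data: (sessions, orders, revenue).
    channels = {
        "email":  (12_000, 480, 43_200.0),
        "search": (30_000, 600, 48_000.0),
        "social": (18_000, 180, 12_600.0),
    }

    for name, (sessions, orders, revenue) in channels.items():
        conversion = orders / sessions  # share of visits that end in a purchase
        avg_order = revenue / orders    # average order value
        print(f"{name:>6}: conversion={conversion:.1%}  avg order=${avg_order:,.2f}")

Run over real data, exactly this comparison (here, email converts at 4% with a $90 average order while social converts at 1% with $70) is what tells a retailer which marketing strategies really work.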

Read more

Data Scientist: Do You Truly Need Big Data?

Data science has historically had to content itself with mere samples. Few data scientists have had the luxury of being able to amass petabytes of data on every relevant variable of every entity in the population under study.

The big data revolution is making that constraint a thing of the past. Think of this new paradigm as "whole-population analytics," rather than simply the ability to pivot, drill, and crunch into larger data sets. Over time, as the world evolves toward massively parallel approaches such as Hadoop, we will be able to do true 360-degree analysis. For example, as more of the world's population takes to social networking and conducts more of its lives in public online forums, we will all have comprehensive, current, and detailed market intelligence on every demographic available as if it were a public resource. As the prices of storage, processing, and bandwidth continue their inexorable decline, data scientists will be able to keep the entire population of all relevant polystructured information under their algorithmic microscopes, rather than having to rely on minimal samples, subsets, or other slivers.
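A minimal Python sketch, using simulated data purely to illustrate the contrast, shows what sampling costs and what whole-population analytics buys: on a heavy-tailed distribution, small samples estimate the mean badly, while the full-population figure is simply computed rather than estimated:

    import numpy as np

    rng = np.random.default_rng(42)
    # A heavy-tailed "population" of 10 million entities (think customer spend).
    population = rng.lognormal(mean=3.0, sigma=1.5, size=10_000_000)

    true_mean = population.mean()  # whole-population analytics: compute outright
    for n in (100, 10_000, 1_000_000):
        sample = rng.choice(population, size=n, replace=False)
        err = abs(sample.mean() - true_mean)
        print(f"sample n={n:>9,}: mean={sample.mean():8.2f}  error={err:8.2f}")
    print(f"population ({population.size:,}): mean={true_mean:8.2f}")

In a massively parallel environment such as Hadoop, that "mean over everything" becomes an embarrassingly parallel job over the full data set, which is exactly why the sampling constraint is disappearing.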

Clearly, the big data revolution is fostering a powerful new type of data science. Having more comprehensive data sets at our disposal will enable more fine-grained long-tail analysis, microsegmentation, next best action, customer experience optimization, and digital marketing applications. It is speeding answers to any business question that requires detailed, interactive, multidimensional statistical analysis; aggregation, correlation, and analysis of historical and current data; modeling and simulation, what-if analysis, and forecasting of alternative future states; and semantic exploration of unstructured data, streaming information, and multimedia.

Read more