Follow The Conversation From Forrester's IT Forum 2011

Today we’re kicking off Forrester's IT Forum 2011 at The Palazzo in Las Vegas. Prepare for three exciting days of keynote presentations and track sessions focused on business and technology alignment. Use the Twitter widget below to follow the Forum conversation by tracking our event hashtag #ITF11 on Twitter. Attendees are encouraged to tweet throughout the Forum and to tweet any questions for our keynote presenters to #ITF11.

Is Moore's Law Still Valid?

Has anybody noticed that processor speed has stopped doubling every 18 months? This occurred to me the other day, so I took some time to figure out why and draw some conclusions about Moore's law and the impacts of continued advances in chip technology. Here's what I've come up with: 1) Moore's law is still valid, but the way processor power is measured has changed, 2) disk-based memory is going the way of the cassette tape, and 3) applications will move into the cloud.

We have pushed semiconductor technology to its physical limits, including the limits of our ability to cool chips and, ultimately, the speed of light. As a result, chip manufacturers have turned to multicore processing technology rather than pure chip and bus speed. Now the power of a microprocessor is judged by the number of cores it contains — and the number of cores on a single chip will continue to increase for the near future.

So what? Extra cores per chip mean more parallel processing to speed through operations — so parallel is the future.
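That shift can be sketched in a few lines of Python: a CPU-bound task (counting primes, a hypothetical example not from this post) is split into chunks and handed to one worker process per core, so throughput grows with core count rather than clock speed.

```python
import math
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division (deliberately CPU-bound)."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, math.isqrt(n) + 1)):
            count += 1
    return count

def count_primes_parallel(limit, workers=4):
    # Split the range into one chunk per worker; each core grinds through a chunk.
    # Processes (not threads) are used so Python's GIL doesn't serialize the work.
    step = limit // workers
    chunks = [(i * step, limit if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_primes, chunks))

if __name__ == "__main__":
    # Same answer either way; the parallel version just uses more cores.
    assert count_primes_parallel(10_000) == count_primes((0, 10_000))
```

The serial function is unchanged; only the partitioning and the pool are new, which is exactly why "parallel is the future" puts the burden on software design rather than hardware.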

Two other trends are also important to understanding my conclusions:

  1. RAM keeps getting more powerful and cheaper.
  2. As the number of cores in a chip goes up, its ability to process data begins to exceed bus technology’s ability to deliver it. Bus speed is governed by Moore’s law.

Blogging From the IBM Big Data Symposium - Big Is More Than Just Big

I just attended a Big Data symposium courtesy of IBM and thought I’d share a few insights, as many of you have probably heard the term but are not sure what it means for you.

No. 1: Big Data is about looking out of the front window when you drive, not the rearview mirror. What do I mean? The typical decision-making process goes something like this: capture some data, integrate it, analyze the clean and integrated data, make some decisions, execute. By the time you decide and execute, the data may be too old and have cost you too much. It’s a bit like driving by looking out of your rearview mirror.

Big Data changes this paradigm by allowing you to iteratively sift through data at extreme scale in the wild and draw insights closer to real time. This is a very good thing, and companies that do it well will beat those that don’t.

No. 2: Big is not just big volume. The term “Big Data” is a misnomer, and it is causing some confusion. Several of us here at Forrester have been saying for a while that it is about the four “V’s” of data at extreme scale – volume, velocity, variety, and variability. I was relieved when IBM came up with three of them, variability being the one they left out.

Some of the most interesting examples we discussed centered on the last three V’s – we heard from a researcher who is collecting vital-sign data from premature babies and correlating changes in heart rates with early signs of infection. According to her, they collect 90 million data points per patient per day! What do you do with that stream of information? How do you use it to save lives? It is a Big Data problem.
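As a back-of-envelope check on that figure, 90 million readings per day works out to roughly a thousand readings per second per patient. A toy sliding-window detector (hypothetical code, not the researcher's actual method; the window and threshold are invented) gives a feel for the near-real-time sifting involved:

```python
from collections import deque

POINTS_PER_DAY = 90_000_000          # figure quoted by the researcher
SECONDS_PER_DAY = 24 * 60 * 60

rate = POINTS_PER_DAY / SECONDS_PER_DAY   # roughly 1,042 readings/second

def heart_rate_alerts(stream, window=60, drop_bpm=10):
    """Flag readings that fall drop_bpm below the rolling mean --
    a toy stand-in for correlating heart-rate changes with infection."""
    recent = deque(maxlen=window)
    alerts = []
    for t, bpm in enumerate(stream):
        if len(recent) == window and (sum(recent) / window) - bpm >= drop_bpm:
            alerts.append(t)
        recent.append(bpm)
    return alerts
```

The point is not the algorithm but the posture: the data is scored as it streams past, rather than being loaded, cleaned, and analyzed after the fact.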


Not Your Grandfather’s Data Warehouse

As I dug into my initial research, it dawned on me – some technology trends are having an impact on information management/data warehouse (DW) architectures, and EAs should consider these when planning out their firm’s road map. My next thought – this wasn’t completely obvious when I began. The final thought? As the EA role analyst covering emerging technology and trends, this is the kind of material I need to be writing about.

Let me explain:

No. 1: Big Data expands the scope of DWs. A challenge with typical data management approaches is that they are not suited to dealing with data that is poorly structured, sparsely attributed, and high-volume. For example, today’s DW appliances boast abilities to handle up to 100 TB of volume, but the data must be transformed into a highly structured format to be useful. Big Data technology applies the power of massively parallel distributed computing to capture and sift through data gone wild – that is, data at an extreme scale of volume, velocity, and variability. Big Data technology does not deliver insight, however – insights depend on analytics that result from combining the results of things like Hadoop MapReduce jobs with manageable “small data” already in your DW.
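A miniature of that combination, sketched in Python: a map/reduce-style aggregation over raw event records stands in for a Hadoop MapReduce job, and its output is joined to a small dimension table standing in for the warehouse side. All names and data here are illustrative, not from the post.

```python
from functools import reduce
from collections import Counter

# "Big" side: raw, high-volume event log (hypothetical clickstream records).
events = [("sku-1", 1), ("sku-2", 1), ("sku-1", 1), ("sku-3", 1), ("sku-1", 1)]

# Map phase: emit per-record (key, count) pairs.
def mapper(event):
    sku, clicks = event
    return Counter({sku: clicks})

# Reduce phase: merge the pairs into totals per key.
click_totals = reduce(lambda a, b: a + b, map(mapper, events), Counter())

# "Small" side: a manageable dimension table already in the warehouse.
warehouse_dim = {"sku-1": "widget", "sku-2": "gadget", "sku-3": "gizmo"}

# The insight comes from joining the reduced output back to warehouse data.
report = {warehouse_dim[sku]: n for sku, n in click_totals.items()}
print(report)  # {'widget': 3, 'gadget': 1, 'gizmo': 1}
```

Neither half is interesting alone: the reduced counts lack business meaning, and the dimension table lacks the behavior. The join is where the analytics live.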

Even the notion of a DW is changing when we start to think “Big” – Apache just graduated Hive from being part of Hadoop to its own project (Hive is a DW framework for Big Data). If you have any doubt, read James Kobielus’ “The Forrester Wave™: Enterprise Data Warehousing Platforms, Q1 2011.”
