What Happens When Central "IT" No Longer Exists?

When we get used to something, we tend to assume it will never change, but it eventually does. Who bought a house in 2006 without assuming the value would surely keep going up?

We are living through an architectural inflection point. The signals are all around us – cloud, big data, mobility, smart computing, etc. While each of these may appear only loosely connected to the others, I think together they signify a major shift in how business gets done and in the architecture that supports it. If so, the tried-and-true Business-Data-Applications-Technology model architected and delivered by central IT will not serve us much longer.

Consider the following:

  • Big and complex are here to stay. In the past we strove for simplicity because we did not have the techniques and technology to deal with the world as it is – infinitely complex. Read Chaos: Making a New Science by James Gleick. The cloud has brought the power of distributed, elastic computing to bear on enormous problems, and this trend will continue. Will central IT continue to grow in response to the increasing size and complexity of technology problems, or will a different model arise?
  • The cloud and the App Internet are two sides of the same coin. The cloud is about optimizing the power of centralized data processing, while the App Internet is about exploiting the enormous power of mobile devices on the periphery. What happens when we figure out how these work together? Can we create a smart grid across mobile devices that also leverages cloud resources? What can we accomplish when apps no longer live in central data centers that we own and control?

Is Moore's Law Still Valid?

Has anybody noticed that processor speed has stopped doubling every 18 months? This occurred to me the other day, so I took some time to figure out why and to draw some conclusions about Moore's law and the impact of continued advances in chip technology. Here's what I've come up with: 1) Moore's law is still valid, but the way processor power is measured has changed; 2) disk-based memory is going the way of the cassette tape; and 3) applications will move into the cloud.

We have pushed semiconductor technology to its physical limits, including our ability to cool chips and the speed of light. As a result, chip manufacturers have turned to multicore processing technology rather than pure chip and bus speed. Now the power of a microprocessor is judged by the number of cores it contains — and the number of cores on a single chip will continue to increase for the near future.

So what? More cores per chip mean more parallel processing to speed through operations, so parallel is the future.
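To see what that means in practice, here is a minimal sketch, assuming only Python's standard library; the workload and task sizes are arbitrary stand-ins for real CPU-bound work.

```python
from concurrent.futures import ProcessPoolExecutor
import math

def cpu_bound_task(n: int) -> float:
    # Burn CPU cycles so the benefit of extra cores is visible.
    return sum(math.sqrt(i) for i in range(n))

if __name__ == "__main__":
    inputs = [5_000_000] * 8
    # The process pool spreads the tasks across available cores.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(cpu_bound_task, inputs))
    print(f"completed {len(results)} tasks across multiple cores")
```

On a multicore machine the eight tasks should finish in a fraction of the time a serial loop would take, with the speedup bounded by the number of cores.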

Two other trends are also important to understanding my conclusions:

  1. RAM keeps getting more powerful and cheaper (the rough timing sketch after this list shows why that matters).
  2. As the number of cores in a chip goes up, its ability to process data begins to exceed bus technology's ability to deliver it, because bus speed is not governed by Moore's law.
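To make the first trend's implication concrete, here is a minimal timing sketch, again assuming only Python's standard library. It writes roughly 100 MB to a scratch file, re-reads it from disk, and then copies the same bytes within RAM; the file name and size are arbitrary, and OS caching can flatter the disk number.

```python
import os
import time

# Create ~100 MB of data and write it to a scratch file on disk.
data = os.urandom(100 * 1024 * 1024)
with open("scratch.bin", "wb") as f:
    f.write(data)

# Time a full read of the file back from disk.
start = time.perf_counter()
with open("scratch.bin", "rb") as f:
    f.read()
disk_s = time.perf_counter() - start

# Time a full copy of the same 100 MB entirely within RAM.
start = time.perf_counter()
bytearray(data)
ram_s = time.perf_counter() - start

print(f"disk read: {disk_s:.3f}s   RAM copy: {ram_s:.3f}s")
os.remove("scratch.bin")
```

On typical hardware the in-RAM copy is at least an order of magnitude faster, which is why cheap, plentiful RAM keeps pulling working data off disk.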

Blogging From the IBM Big Data Symposium - Big Is More Than Just Big

I just attended a Big Data symposium courtesy of IBM and thought I'd share a few insights, since many of you have probably heard the term but aren't sure what it means for you.

No. 1: Big Data is about looking out of the front window when you drive, not the rearview mirror. What do I mean? The typical decision-making process goes something like this: capture some data, integrate it, analyze the clean and integrated data, make some decisions, execute. By the time you decide and execute, the data may be too old and may have cost you too much. It's a bit like driving by looking in your rearview mirror.

Big Data changes this paradigm by allowing you to iteratively sift through data at extreme scale in the wild and draw insights closer to real time. This is a very good thing, and companies that do it well will beat those that don’t.

No. 2: Big is not just big volume. The term "Big Data" is a misnomer, and it is causing some confusion. Several of us here at Forrester have been saying for a while that it is about the four "V's" of data at extreme scale – volume, velocity, variety, and variability. I was relieved when IBM came up with three of them, variability being the one they left out.

Some of the most interesting examples we discussed centered on the last three V's – we heard from a researcher who is collecting vital-sign data from neonatal babies and correlating changes in heart rate with early signs of infection. According to her, they collect 90 million data points per patient per day! What do you do with that stream of information? How do you use it to save lives? It is a Big Data problem.
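For a sense of scale, 90 million readings per patient per day is roughly 1,000 per second (90,000,000 / 86,400 ≈ 1,042). One plausible way to use such a stream, sketched here purely as an illustration and not as the researcher's actual method, is to keep a rolling baseline in memory and flag readings that deviate sharply from it; the window size, warm-up count, and threshold below are all made up.

```python
from collections import deque
from statistics import mean, stdev

# Hypothetical rolling-window monitor for a vital-signs stream. The
# parameters are illustrative, not taken from the research above.
def monitor(readings, window_size=5_000, z_threshold=3.0):
    window = deque(maxlen=window_size)
    for timestamp, bpm in readings:
        if len(window) >= 100:  # wait for a minimal baseline first
            mu, sigma = mean(window), stdev(window)
            if sigma > 0 and abs(bpm - mu) / sigma > z_threshold:
                yield timestamp, bpm  # candidate event for clinical review
        window.append(bpm)

# Tiny demo: a steady stream with a sudden drop at t = 200.
stream = [(t, 120.0 + (t % 5)) for t in range(200)] + [(200, 60.0)]
print(list(monitor(stream)))  # -> [(200, 60.0)]
```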


Not Your Grandfather’s Data Warehouse

As I dug into my initial research, it dawned on me – some technology trends are having an impact on information management/data warehouse (DW) architectures, and EAs should consider these when planning out their firm's road map. My next thought – this wasn't completely obvious when I began. The final thought? As the analyst covering emerging technology and trends for the EA role, I need to be writing about exactly this kind of material.

Let me explain:

No. 1: Big Data expands the scope of DWs. A challenge with typical data management approaches is that they are not suited to dealing with data that is poorly structured, sparsely attributed, and high-volume. For example, today's DW appliances boast the ability to handle up to 100 TB of volume, but the data must be transformed into a highly structured format to be useful. Big Data technology applies the power of massively parallel distributed computing to capture and sift through data gone wild – that is, data at an extreme scale of volume, velocity, and variability. Big Data technology does not deliver insight by itself, however – insights come from analytics that combine the results of things like Hadoop MapReduce jobs with the manageable "small data" already in your DW.
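Below is a minimal, hypothetical illustration of that pattern: a simple in-process aggregation stands in for a Hadoop MapReduce job over raw events, and a small dictionary stands in for the curated "small data" a DW already holds. All names and values are invented.

```python
from collections import defaultdict

# Stand-in for raw, lightly structured events a distributed job would
# chew through at much larger scale.
raw_events = [
    ("sensor-1", 4.2),
    ("sensor-2", 9.8),
    ("sensor-1", 5.0),
    ("sensor-3", 1.1),
]

# Stand-in for curated "small data" already in the warehouse.
dw_dimension = {"sensor-1": "Plant A", "sensor-2": "Plant B"}

# Map/reduce-style aggregation over the raw events.
totals = defaultdict(float)
for key, value in raw_events:
    totals[key] += value

# The insight comes from joining the aggregates with warehouse context.
for key, total in sorted(totals.items()):
    site = dw_dimension.get(key, "unknown site")
    print(f"{site}: total reading {total:.1f}")
```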

Even the notion of a DW is changing when we start to think "Big" – Apache just graduated Hive from being part of Hadoop to its own project (Hive is a DW framework for Big Data). If you have any doubt, read James Kobielus' "The Forrester Wave™: Enterprise Data Warehousing Platforms, Q1 2011."


Closing The Innovation Gap

Greetings — thanks for taking the time to read my inaugural blog! Let me introduce myself by way of continuing a discussion that I started at Practicing EA and CIO.com on innovation and technology that I think strikes at the heart of our challenges as enterprise architects. It also provides a good context for my future research, which I discuss at the end.

Closing The Innovation Gap

In part 1 of this post, I claimed that a gap opened while we were fighting the overly complex, expensive current state and trying to help our business partners innovate with new technology.

The gap – We cannot deliver new technology and innovation quickly or cheaply enough.

Shadow IT Is The Symptom, Not The Cause

  • The Symptom – We often blame Shadow IT and manual workarounds for increasing complexity, reducing quality of service, and obscuring true technology costs. These are symptoms of the problem, not the problem itself.
  • The Cause – Business users know best what they need and when they need it, and they are the most motivated to solve their problems now, not once the budget cycle gets around to funding a project. Central IT, where most EAs practice, is a knowledge store for designing enterprise-scale systems but is constrained in its ability to deliver.