Cloud Computing Will Save IT Millions, But Only If You Have Elastic Applications

Do you keep every single light on in your house even though you are fast asleep in your bedroom?

Of course you don't. That would be an abject waste. Then why do most firms deploy peak-capacity infrastructure that runs around the clock even though their applications have distinct usage patterns? Sometimes the applications are sleeping (low usage). At other times, they are huffing and puffing under the stampede of glorious customers. The answer is that they have no choice. Application developers and infrastructure operations pros collaborate (call it DevOps if you want) to determine the infrastructure that will be needed to meet peak demand.

  • One server, two server, three server, four.
  • The business is happy when the web traffic pedal is to the floor.
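To make "elastic" concrete, here is a minimal sketch (in Python) of the kind of scale-with-demand control loop an elastic application can sit behind. It is illustrative only: get_current_load, count_instances, add_instances, and remove_instances are placeholders for whatever monitoring and provisioning APIs your cloud platform actually exposes, and the thresholds are made-up numbers, not recommendations.

import time

# Illustrative thresholds and bounds -- hypothetical, tune for your own workload.
SCALE_UP_UTILIZATION = 0.75    # add capacity above 75% average utilization
SCALE_DOWN_UTILIZATION = 0.25  # shed capacity below 25% average utilization
MIN_INSTANCES = 2
MAX_INSTANCES = 20

def autoscale(get_current_load, count_instances, add_instances, remove_instances):
    """Grow and shrink capacity with demand instead of provisioning for peak.

    The four callables are stand-ins for your platform's monitoring and
    provisioning APIs.
    """
    while True:
        utilization = get_current_load()   # e.g., average CPU across the fleet
        instances = count_instances()

        if utilization > SCALE_UP_UTILIZATION and instances < MAX_INSTANCES:
            add_instances(1)               # customers are stampeding: scale out
        elif utilization < SCALE_DOWN_UTILIZATION and instances > MIN_INSTANCES:
            remove_instances(1)            # the application is sleeping: scale in

        time.sleep(60)                     # re-evaluate every minute

Only an application architected to keep working correctly as instances come and go can take advantage of a loop like this, which is exactly why elasticity is an application property, not just an infrastructure one.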
Read more

We Need Some New Categories Of Testing Software For Humans

During my research for the just-published document "For Developers, Dog Food And Champagne Can't Be The Only Items On The Menu," I had an interesting conversation with the team at Adobe that handles internal pilots, which in their case entails more than just putting the next version of an Adobe product on the network for people to try. Instead of the typical "spaghetti against the wall" approach to "eating your own dog food" (to mix food metaphors), the Adobe team actively looks for use cases that fit the product. If a product like Flex or Photoshop is a tool, then it should be the right tool for some job. (And if you can't find any use for the software, you're definitely in trouble.)

This approach might require more work than the "spaghetti against the wall" approach, but it pays dividends for many different groups. The product team identifies functionality gaps or usability flaws. Marketers and salespeople have a much easier time figuring out what to demo. As a result, Adobe stands a better chance of both building technology that's compelling and then explaining what's compelling about it to potential customers.

Read more

Painting The IT Industry Landscape

All of us in the technology industry get caught up in the near-term fluctuations and pressures of our business. This quarter’s earnings, next quarter’s shipments, this year’s hiring plan . . . it’s easy to get swallowed up by the flood of immediate concerns. So one of the things that we work hard on at Forrester, and that our clients value in their relationships with us, is taking a few steps back and looking at the longer-term, bigger picture of the size and shape of the industry’s trajectory. It provides strategic and financial context for the short-term fluctuations and trends that buffet all of us.

I am lucky to co-lead research in Forrester's Vendor Strategy team, which is explicitly chartered to predict and quantify the new growth opportunities and disruptions facing strategists at some of our leading clients. We will put those predictions on display later this month at Forrester's IT Forum, our flagship client event. Among the sessions that Vendor Strategy analysts will be leading:

  • "The Software Industry in Transition": Holger Kisker will preview his latest research detailing best practices for software vendors navigating the tricky transition from traditional license to as-a-service pricing and engagement models.
  • "The Computing Technologies of 2016": Frank Gillett will put us in a time machine for a trip five years into the future of computing, storage, network, and component technologies that will underpin new applications, new experiences, and new computing capabilities.
Read more

Why Disqus, And Pre-Existing Social Identities, Matter To B2B Marketers

[co-authored by Zachary Reiss-Davis]

Disqus, a SaaS commenting platform that companies can embed in any website page, announced a $10 million funding round yesterday. In the same announcement, the company stated that it reaches 500 million unique visitors a month across 750,000 different websites, including major media sites like CNN and Fox News as well as many high-tech news sources, such as ReadWriteWeb (which wrote an article on the announcement).

As a B2B marketer, why does this matter to you? Because B2B sites can learn from these largely media or consumer examples. B2B sites that want to enable community and commenting on their pages, including blog posts, need to make it extremely simple for visitors to engage using whichever of their social identities (and resulting social networks) they want to bring to the site.

Requiring a unique login to get an IT developer to share feedback on your new server architecture, for example, makes it easier to capture information in your CRM system, but your visitors want to add value to their existing social identities. Allowing visitors to engage with you using their preferred identities does them a valuable service by strengthening the online identity they have already invested in. See the screenshot below for what this looks like with Disqus. This may increase their willingness to engage on your website rather than elsewhere across the Internet.

Read more

Not Your Grandfather’s Data Warehouse

As I dug into my initial research, it dawned on me – some technology trends are having an impact on information management/data warehouse (DW) architectures, and EAs should consider these when planning out their firm’s road map. The next thought I had – this wasn’t completely obvious when I began. The final thought? As the EA role analyst covering emerging technology and trends, this is the kind of material I need to be writing about.

Let me explain:

No. 1: Big Data expands the scope of DWs. A challenge with typical data management approaches is that they are not suited to dealing with data that is poorly structured, sparsely attributed, and high-volume. For example, today’s DW appliances boast the ability to handle up to 100 TB of volume, but the data must be transformed into a highly structured format to be useful. Big Data technology applies the power of massively parallel distributed computing to capture and sift through data gone wild – that is, data at an extreme scale of volume, velocity, and variability. Big Data technology does not deliver insight, however – insights depend on analytics that result from combining the results of things like Hadoop MapReduce jobs with the manageable “small data” already in your DW.
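As a rough illustration of that last point, here is a hypothetical Python sketch in which a Hadoop MapReduce job has already boiled raw clickstream logs down to per-product click counts, and those aggregates are then joined with “small data” product records extracted from the DW. The file names and columns are invented for the example.

import csv
from collections import defaultdict

# Hypothetical output of a Hadoop MapReduce job: "product_id\tclicks" per line,
# already distilled from terabytes of raw clickstream data.
MAPREDUCE_OUTPUT = "part-r-00000.tsv"

# Hypothetical "small data" extract from the warehouse: curated product records
# with columns product_id, category, revenue.
WAREHOUSE_EXTRACT = "dw_product_dimension.csv"

def load_big_data_aggregates(path):
    counts = defaultdict(int)
    with open(path) as f:
        for line in f:
            product_id, clicks = line.rstrip("\n").split("\t")
            counts[product_id] += int(clicks)
    return counts

def join_with_warehouse(counts, path):
    """Combine Big Data aggregates with curated DW attributes to produce insight."""
    insights = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            clicks = counts.get(row["product_id"], 0)
            revenue = float(row["revenue"])
            insights.append({
                "category": row["category"],
                "clicks": clicks,
                # Revenue per click: the kind of metric neither data set yields alone.
                "revenue_per_click": revenue / clicks if clicks else 0.0,
            })
    return insights

if __name__ == "__main__":
    aggregates = load_big_data_aggregates(MAPREDUCE_OUTPUT)
    for insight in join_with_warehouse(aggregates, WAREHOUSE_EXTRACT):
        print(insight)

The point of the sketch is the division of labor: Hadoop does the heavy sifting at extreme scale, while the warehouse supplies the trusted, structured context that turns raw counts into something decision-worthy.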

Even the notion of a DW is changing when we start to think “Big” – Apache just graduated Hive from being part of Hadoop to its own project (Hive is a DW framework for Big Data). If you have any doubt, read James Kobielus’ “The Forrester Wave™: Enterprise Data Warehousing Platforms, Q1 2011.”

Read more

Software Development And The Transparent Company (Part II)

[For the first part in this series, click here.]

Recently, I spoke with a major airline about their adoption of Agile, which they considered critical for a major customer loyalty project. Based on previous experience, the dev team expected the business users involved in this project to move slowly, so they saw Agile as a strategy for being ready to pounce on any opportunity to make progress. How slowly? The current estimate for the project's completion was...[drumroll]...five years. Now that's a customer loyalty program guaranteed to be left with just the most loyal customers imaginable.

As hair-raising a situation as this might be, it's hardly unique. App dev teams contributing embedded software elements to hardware products must time their deliverables to arrive at key milestones in the overall release schedule. Compliance requirements weigh down software development with extra documentation and validation. Flawed requirements force teams to go back to the drawing board. Dev managers live and die by the schedule, and there's always something that could jeopardize it. Development is pretty pointless unless someone delivers the bits and bytes, but dev ops remains a relatively mysterious and unpredictable process for dev teams, one over which they have little control once they throw their code over the wall.

Read more

Intel Shows the Way Forward, Demos 22 nm Parts with Breakthrough Semiconductor Design

What Intel said and showed

Intel has been publishing research for about a decade on what it calls “3D Trigate” transistors, which held out the hope of both improved performance and better power efficiency. Today Intel revealed details of its commercialization of this research in its upcoming 22 nm process and demonstrated actual systems based on 22 nm CPU parts.

The new products, code-named “Ivy Bridge”, are the process shrink of the recently announced Sandy Bridge architecture in the next “Tick” cycle of Intel’s famous “Tick-Tock” design methodology, where the “Tock” is a new, optimized architecture and the “Tick” is the shrinking of that architecture onto the next-generation semiconductor process.

What makes these Trigate transistors so innovative is that they change the fundamental geometry of the semiconductor from a basically flat “planar” design to one with more vertical structure, earning them the description “3D”. For users, the concepts are simpler to understand – this new transistor design, which will become the standard across all of Intel’s products going forward, delivers some fundamental benefits to the CPUs built with it:

  • Leakage current is reduced to near zero, resulting in very efficient operation for systems in an idle state.
  • Power consumption at equivalent performance is reduced by approximately 50% from Sandy Bridge’s already improved results on its 32 nm process (the rough power model sketched below puts illustrative numbers on both effects).
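Here is a back-of-the-envelope Python model of those two effects. The formula (dynamic switching power plus leakage) is standard CMOS accounting, but every number below is an invented illustration, not an Intel specification.

# Total chip power = dynamic switching power + static (leakage) power.
def chip_power(capacitance, voltage, frequency, activity, leakage_current):
    dynamic = capacitance * voltage ** 2 * frequency * activity  # C * V^2 * f * activity
    static = voltage * leakage_current                           # leakage burns power even at idle
    return dynamic + static

# Hypothetical planar 32 nm part: noticeable leakage.
planar_active = chip_power(33.3e-9, 1.0, 3.0e9, 0.3, 10.0)
planar_idle   = chip_power(33.3e-9, 1.0, 3.0e9, 0.0, 10.0)    # idle: only leakage remains

# Hypothetical Trigate 22 nm part: lower voltage, leakage driven toward zero.
trigate_active = chip_power(33.3e-9, 0.8, 3.0e9, 0.3, 0.5)
trigate_idle   = chip_power(33.3e-9, 0.8, 3.0e9, 0.0, 0.5)

print(f"Active: {planar_active:.1f} W -> {trigate_active:.1f} W "
      f"({1 - trigate_active / planar_active:.0%} lower at the same clock and workload)")
print(f"Idle:   {planar_idle:.1f} W -> {trigate_idle:.1f} W (leakage nearly eliminated)")

With these made-up inputs, the model lands on roughly half the active power at equivalent performance and a small fraction of the idle power, which is the shape of the claim Intel is making.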
Read more

A Few Thoughts On Communicating Risk

In my new report, The Risk Manager's Handbook: How To Measure And Understand Risks, I present industry best practices and guidance on ways to articulate the extent or size of a risk. More than the interpersonal, political, and leadership skills required of a risk management professional, defining how risks are measured and communicated is where I believe they prove their worth. If risk measurement techniques are too complicated, they may discourage crucial input from colleagues and subject matter experts... but if they are too simple, they won't yield enough relevant information to guide important business decisions. Great communication skills can only hide irrelevant information for so long.

This report includes factors to use in the risk measurement process, ways to present risk measurement data meaningfully, and criteria for deciding which of these methods are most appropriate. As always, your feedback is welcome and appreciated.

In addition, I will be covering a related topic with our Security and Risk Council in a session called Creating A High-Impact Executive Report along with my colleague Ed Ferrara at Forrester's upcoming IT Forum: Accelerate At The Intersection Of Business And Technology, May 25-27, in Las Vegas. Please join us if you can make it. Later in the week, I will be available for 1-on-1 meetings with attendees, and I'll also present sessions on linking governance and risk and on establishing good vendor risk management practices. I hope to see you there.

CIOs: At What Stage Is Your Thinking On Cloud Economics?

Is your cloud strategy centered on saving money or fueling revenue growth? Where you land on this question says a lot about your experience level with cloud services and what guidance you should be giving to your application developers and infrastructure & operations teams. According to our research, the majority of CIOs would vote for the savings, seeing cloud computing as an evolution of outsourcing and hosting that can drive down capital and operating expenses. In some cases this is correct, but in many others the opposite is true: use the cloud wrong, and it may actually raise your costs.
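A deliberately simple, hypothetical comparison shows how both outcomes happen. The rates below are placeholders, not vendor pricing; the point is that a steady 24x7 workload can be cheaper on owned or hosted capacity, while a spiky workload that only needs its peak for a few hours a day is where pay-per-use wins.

HOURS_PER_MONTH = 730

# Hypothetical unit costs -- substitute your own numbers.
CLOUD_RATE_PER_INSTANCE_HOUR = 0.50      # on-demand, pay-per-use
OWNED_COST_PER_INSTANCE_MONTH = 220.00   # amortized hardware, power, space, admin

def monthly_cloud_cost(instance_hours):
    return instance_hours * CLOUD_RATE_PER_INSTANCE_HOUR

def monthly_owned_cost(peak_instances):
    # Owned capacity is provisioned for peak and paid for around the clock.
    return peak_instances * OWNED_COST_PER_INSTANCE_MONTH

# Steady workload: 10 instances busy 24x7 -- renting every hour costs more than owning.
steady_cloud = monthly_cloud_cost(10 * HOURS_PER_MONTH)   # $3,650
steady_owned = monthly_owned_cost(10)                     # $2,200

# Spiky workload: 10 instances at peak, but only ~4 busy hours a day.
spiky_cloud = monthly_cloud_cost(10 * 4 * 30)             # $600
spiky_owned = monthly_owned_cost(10)                      # still $2,200

print(f"Steady 24x7 load: cloud ${steady_cloud:,.0f}/mo vs. owned ${steady_owned:,.0f}/mo")
print(f"Spiky load:       cloud ${spiky_cloud:,.0f}/mo vs. owned ${spiky_owned:,.0f}/mo")

Which column wins depends entirely on the usage pattern, which is why exploring the actual use cases matters more than the abstract savings-versus-revenue debate.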

But this isn’t a debate worth having, because it’s the exploration of the use cases where cloud does save you money that bears the real fruit. And it’s through this experience that you can start shifting your thinking from cost savings to revenue opportunities. Forrester surveys show that the top reason developers (and the empowered non-developers in your business units) tap into cloud services is to rapidly deploy new services and capabilities. And the drivers behind these efforts – new services, better customer experience, and improved productivity. Translation: revenues and profits.

If the cloud is bringing new money in the door, does it really matter whether it’s the cheaper solution? Not at first. But over time, using cloud as a revenue engine doesn’t necessarily mean high margins on that revenue. That’s where your experience with the cost-advantaged uses of cloud comes in.

Read more