Software Requirements Are Where We Define Value

Revolutions take two forms. The most familiar kind is the noisy, conspicuous, disjunctive event that marks a clean break from the past. Yesterday, George III was our monarch. Today, he's not. The other kind of revolution is a more gradual and subtle event, when multiple forces pointing in the same direction push people into a new world. The shock of Pearl Harbor, the power vacuum left by a devastated Europe and Japan, a reinvigorated economy, and an aggressive superpower adversary made Americans feel, for the first time, that they needed to be far more deeply involved in international affairs than ever before. Without any formal declaration, Americans became internationalists after 1945.

Something like that second kind of revolution has happened with software requirements. Over the past decade or so, organizations grew increasingly worried about the problems that took root in bad requirements. The problems took many forms (portfolios filled with applications no one was using, users unhappy with software that complicated their lives more than it helped them, ideas that no one vetted carefully, etc.) and arose from just as many sources.

All of these discontents pointed in a common direction: Take requirements more seriously. In Forrester's Q1 2011 Application Development And Delivery Organization Structure Online Survey, "improvements of requirements" appeared at the top of the list of initiatives that would improve software development the most.

This quiet revolution in requirements is the topic of a recently published research document by Yours Truly. Many problems have inspired many new practices and disciplines, such as visualization, and even resuscitated older ones like model-based requirements. Correctly smelling a market opportunity, vendors have devised all kinds of cunning new tools. Some are highly specialized, the right tool for a specific job, like Balsamiq and iRise for visualization. Others combine many capabilities in Swiss army knife-like fashion, such as Blueprint and eDev. And there are a lot of these new requirements-focused companies, many of which didn't exist 10 years ago.

So what is the common direction in which all these problems and practices and tools are going? Value.

Throughout the entire software development life cycle, there is one place where you can (or should) find the definition of the software's value: requirements. In the language of use cases, personas, nonfunctional requirements, road maps, prioritization, and the like, they describe why the software will be valuable for the technology consumer, and why there's a compelling reason for the technology producer to build it.

When requirements do a bad job of defining value, software development and delivery run a much higher risk of failure. We add new software to our portfolio in haste and repent at leisure. We build software that works but doesn't work for the people using it. We fail to communicate what realistic usage of the software looks like to the people testing it. We release new capabilities too slowly for our customers to solve their problems or too quickly for them to adapt.

The entire value chain needs a common vision of what we're building, for whom we're building it, and what kind of benefit it provides to both producer and consumer. Without it, smart people with good intentions and mad skillz are just as likely to guess the wrong test to write, the wrong user experience to design, or the wrong item to put at the top of the backlog as the right one. That's the change of mindset that has put requirements at the top of the list of priorities and made the people responsible for requirements willing to invest in new practices and tools.


Requirements gathering - There are no magic bullets

I could not agree with you more about the value of properly constructed and usable requirements. The problem is, there are no magic bullets. Companies that are bad at requirements gathering can try and fail with any methodology. As much as people want to commoditize the software development life cycle, it just cannot be done. The root cause is a simple fact: PEOPLE. You can adopt any delivery model you want, but if you do not have the kinds of folks who are willing to dig into the processes BEFORE defining a system, your efforts to improve those processes are doomed to failure. Developers and other technology-leaning professionals tend to see system-based solutions before looking at process-based solutions. When systems are developed with that mindset, you get a system that mirrors a bad process, and one that almost always perpetuates and exaggerates it.
In the end it comes down to the people gathering the requirements: do they document the process and question every step along the way? The best way to gather requirements is to ask the 5 “Whys” and use people with the skills to interview users in such a way as to get to the core of the process.


Hey Tom, great post! As you correctly point out, the only fundamental reason for any IT project is to provide value for the company, and at the end of the day the emphasis on providing value should inform choices at each step of the process, from the formation of the product concept, through the feature list, down to the individual requirement. A colleague of mine at Seilevel wrote a blog post on this subject that your readers may find interesting. It's about how we should think about "value" on legacy conversion projects, where the value-add may come not from new features or functionality but from reducing licensing costs, say, or improving business processes. Check it out, here:

Thanks again!