What a strange summer this has been! From Boston to London to Paris to Turin, the weather has offered weekly and even daily reversals, with continuous change from sun to rain, from hot and damp to cool and crisp. I missed having a proper spring. Even today, we went from 35º-38º Celsius (95º-100º Fahrenheit) down to 22º Celsius (71º Fahrenheit) with a perfect storm! Such continuous and sudden change in the climate is quite unusual in some of these countries. Certainly it is where the Azores Anticyclone usually dominates from mid-to-late June to mid-to-late August, offering a stable summer. How many times have you had to change plans because you discovered the weather was about to change?
You might be thinking, "What does this have to do with this AD&D blog?" It’s about change! I wonder whether, in our daily lives, getting used to unexpected conditions and having to handle continuous change favors a mindset where change is just something we deal with rather than fight. That’s a mindset we very much need, given the change we see ahead in how we develop, test, and deploy software!
My focus in this blog is testing, although the first change we need to get used to is that we can’t talk any longer about testing in an isolated fashion! Testing is getting more and more interconnected in a continuous feedback loop with development and deployment. (See my colleague Kurt Bittner's report on continuous delivery; I could not agree more with what Kurt says there!)
I recently received a direct mail piece from one of my favorite retailers with a massive ad that proclaimed "We Beat Internet Prices." Now, I am a big fan of straightforward and robust value propositions, but these types of brand exclamations are antiquated and add little value to customers, mainly because they simply reward customers for being good bargain hunters. Instead of simply stating that you beat your competitors’ prices, employing strategic pricing and customer engagement initiatives creates real, distinct value for your customers by:
Showing them you can execute on your low-price promise and not just talk about it. Employing a holistic pricing strategy that meets your customers’ price expectations can indicate to your customers that you are truly ‘walking the walk’ when it comes to offering the lowest price.
Building your credibility. Understanding your customers’ needs and offering solutions that facilitate decisions and generate engagement builds credibility. Simply shouting that you match Internet prices does little to build credibility with your customers.
Helping them with real problems. Shoppers don’t need guidance on finding the lowest price -- they need to understand how your brand and solution help them compared to your competition.
I just finished my new report on the Agile testing tools landscape. I’ll point Forrester readers to it as soon as it publishes. But there are a few things that have struck me since I took over the software quality and testing research coverage at Forrester, and I would like to share them with you in this preview of my findings from the testing tools landscape doc.
My research focus area was initially software development life cycles (SDLCs), with a main focus on Agile and Lean. In fact, my main contribution in the past 12 months has been to the Forrester Agile and Lean playbook, where all my testing research has also focused. Among other reasons, I took the testing research area because testing was becoming more and more a discipline for software developers, so it made sense to extend my software development research focus with testing. But I was not sure how deeply testing would really integrate with development. My concern was that I’d have to spend too much time on the traditional testing standards, processes, and practices and little on new and more advanced development and testing practices. After 12 months, I am happy to say that it was the right bet! My recently published research shows the shift testing is making, and so does the testing tool landscape document, and here is why:
Early this year, on January 15, I published our first research on testing for the Agile and Lean playbook. Connected to that research, my colleague Margo Visitacion and I also published a self-assessment testing toolkit. The toolkit helps app-dev and testing leaders understand how mature their current testing practices and organization are for Agile and Lean development.
The Agile Testing Self-Assessment Toolkit
So what are the necessary elements to assess Agile testing maturity? Looking to compromise between simplicity and comprehensiveness, we focused on the following:
Testing team behavior. Some of the questions we ask here look at collaboration around testing among all roles in the Scrum teams. We also ask about unit testing: Is it a mandatory task for developers? Are all of the repetitive tests that must run at each regression cycle automated?
Organization. In our earlier Agile testing research, we noticed a change in the way testing gets organized when Agile is being adopted. So here we look at the role test managers are playing: Are they focusing more on being coaches and change agents to accelerate adoption of the new Agile testing practices? Or are managers still operating in a command-and-control regime? Is the number of manual testers decreasing? Are testing centers of excellence (TCOEs) shifting to become testing practice centers of excellence (TPCOEs)?
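The kind of repetitive check the self-assessment asks about is easy to picture in code. Here is a minimal sketch in plain Java; the PriceCalculator and its discount rule are invented for illustration, not part of the toolkit:

```java
// Hypothetical sketch: an automated regression check of the kind the
// self-assessment asks about. PriceCalculatorTest and applyDiscount are
// invented names, not from the Forrester toolkit.
public class PriceCalculatorTest {

    // The production code under test (normally in its own class).
    static int applyDiscount(int priceCents, int percent) {
        return priceCents - (priceCents * percent / 100);
    }

    public static void main(String[] args) {
        // Checks like these run unattended at every regression cycle,
        // instead of being re-executed by hand.
        check(applyDiscount(1000, 10) == 900, "10% off 10.00 should be 9.00");
        check(applyDiscount(1000, 0) == 1000, "0% discount leaves price unchanged");
        check(applyDiscount(999, 50) == 500, "integer math: 50% off 9.99 -> 5.00");
        System.out.println("all regression checks passed");
    }

    static void check(boolean ok, String message) {
        if (!ok) throw new AssertionError(message);
    }
}
```

A team answering "yes" to the automation question has moved checks like these out of a manual test script and into a suite that runs on every build.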
DevOps is a movement for developers and operations professionals that encourages more collaboration and release automation. Why? To keep up with the faster application delivery pace of Agile. In fact, with Agile, as development teams deliver faster and in shorter cycles, IT operations finds itself unprepared to keep up with the new pace. For operations teams, managing a continuous stream of software delivery with traditional manual-based processes is Mission Impossible. Vendors have responded to DevOps requirements with more automation in their release management, delivery, and deployment tools. However, there is a key process that sits between development and operations that seems to have been given little attention: testing.
In fact, some key testing activities, like integration testing and end-to-end performance testing, are caught right in the middle of the handover process between development and operations. In the Agile and Lean playbook, I’ve dedicated my latest research precisely to Agile testing, because I’ve seen testing become the bête noire of many Agile transformations when it was initially ignored.
More and more data is stored online by both consumers and businesses. The convenience of using services such as Dropbox, Box, Google Drive, Microsoft Live Skydrive, and SugarSync is indisputable. But, is it safe? All of the services certainly require a user password to access folders, and some of the services even encrypt the stored files. Dropbox reassures customers, "Other Dropbox users can't see your private files in Dropbox unless you deliberately invite them or put them in your Public folder."
The security measures employed by these file-synching and sharing services are all well and good, but they can be instantly, innocently neutered by a distracted programmer. Goodbye, privacy. All your personal files, customer lists, business plans, and top-secret product designs become available for all the world to see. How can this happen even though these services employ sophisticated authentication and encryption technologies? The answer: a careless bug introduced in the code.
Below is some Java code I wrote for a fictitious file-sharing service called CloudCabinet to demonstrate how this can happen. Imagine a distracted programmer texting her girlfriend on her iPhone while cutting and pasting Java code. Even non-Java programmers should be able to find the error in the code below.
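The original snippet is not reproduced here, so as an illustrative stand-in, here is a sketch of the kind of cut-and-paste bug being described; the CloudCabinet class, its Folder type, and the access rules are all invented for this example:

```java
// Hypothetical stand-in for the CloudCabinet snippet: a cut-and-paste bug
// that silently disables access control.
import java.util.HashSet;
import java.util.Set;

public class CloudCabinet {

    static class Folder {
        final String owner;
        final boolean isPublic;
        final Set<String> invitees = new HashSet<>();

        Folder(String owner, boolean isPublic) {
            this.owner = owner;
            this.isPublic = isPublic;
        }
    }

    // Decides whether a user may read a folder.
    static boolean canAccess(String user, Folder folder) {
        if (folder.isPublic) return true;                 // public folders: anyone
        if (folder.owner.equals(user)) return true;       // the owner: always
        if (folder.invitees.contains(user)) return true;  // invited users: yes
        return true;  // BUG: pasted from the line above; should be "return false"
    }

    public static void main(String[] args) {
        Folder secret = new Folder("alice", false);
        // "bob" was never invited, yet authentication and encryption can't save us:
        System.out.println(canAccess("bob", secret)); // prints "true"
    }
}
```

All the authentication and encryption in the world is irrelevant once `canAccess` returns `true` for everyone.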
The essential shape of the enterprise marketing landscape hasn’t changed much over the years. In last week’s Revisiting The Enterprise Marketing Software Landscape, I dissect technologies into the four basic categories of marketing management, brand management, relationship marketing, and interactive marketing. Consumers are rapidly changing behaviors, and marketing as a practice is evolving dramatically, but the technologies that marketers buy continue to come in essentially the same containers.
Notice, however, all of the decision management systems employed across the marketing landscape. From interaction management to online testing to recommendations to contact optimization, marketers are using automated systems to make an increasing number of customer-facing decisions. Viewed from the perspective of those decisions, the landscape of marketing technologies is shifting under our feet.
So is it time for a new take – say, customer decision management (CDM) – on marketing technology?
Security threats develop and evolve with startling rapidity, with the attackers always seeking to stay one step ahead of the S&R professional. The agility of our aggressors is understandable; they do not have the same service-focused restrictions that most organizations have, and they seek to find and exploit individual weaknesses in the vast sea of interconnecting technology that is our computing infrastructure.
If we are to stand a chance of breaking even in this game, we have to learn our lessons and ensure that we don’t repeat the same mistakes over and over. Unfortunately, it is alarmingly common to see well-known vulnerabilities and weaknesses being baked right into new applications and systems, just as if the past five years had never happened!
A recent report released by Alex Hopkins of Context Information Security shines a light on the vulnerabilities they discovered while testing almost 600 pre-release web applications during 2011. The headlines for me were:
On average, the number of issues discovered per application is on the rise.
Two-thirds of web applications were affected by cross-site scripting (XSS).
Nearly one in five web applications were vulnerable to SQL injection.
It makes depressing reading, but I’m interested in why this situation is occurring:
Are S&R professionals simply not educating and guiding application developers?
Are application developers ignoring the training and education? Are we teaching them the wrong things or do we struggle to explain the threats from XSS and SQL injection?
Are our internal testing regimes failing, allowing flawed code to reach release candidate stage?
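Whatever the answer, the flaw in question is decades old and simple to demonstrate. Here is a sketch of the SQL-injection pattern the report counts, in plain JDBC; the users table, method names, and inputs are invented for illustration:

```java
// Hypothetical sketch of the SQL-injection flaw the report counts, in JDBC.
// The users table and method names are invented for illustration.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class LoginDao {

    // Vulnerable: user input is concatenated straight into the SQL text.
    static String buildUnsafeQuery(String name) {
        return "SELECT * FROM users WHERE name = '" + name + "'";
    }

    static ResultSet findUserUnsafe(Connection conn, String name) throws SQLException {
        Statement st = conn.createStatement();
        // Input "' OR '1'='1" turns the WHERE clause into a tautology
        // and returns every row in the table.
        return st.executeQuery(buildUnsafeQuery(name));
    }

    // Fixed: a PreparedStatement keeps the input as data, never as SQL grammar.
    static ResultSet findUserSafe(Connection conn, String name) throws SQLException {
        PreparedStatement ps = conn.prepareStatement(
                "SELECT * FROM users WHERE name = ?");
        ps.setString(1, name);
        return ps.executeQuery();
    }

    public static void main(String[] args) {
        // Demonstrate the injected query text without needing a live database:
        System.out.println(buildUnsafeQuery("' OR '1'='1"));
        // prints: SELECT * FROM users WHERE name = '' OR '1'='1'
    }
}
```

The fix is one line of discipline, which is exactly why its continued prevalence in pre-release code is so depressing.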
Seriously. I recently spoke with a client who swears that software quality improved once they got rid of the QA team. Instead of making QA responsible for quality, they put the responsibility squarely on the backs of the developers producing the software. This seems to go against conventional wisdom about quality software and developers: Don't trust developers. Or, borrowing from Ronald Reagan, trust but verify.
This client is no slouch, either. The applications provide real-time market data for financial markets, and the client does more than 40 software releases per year. If the market data produced by an application were unavailable or inaccurate, then the financial market it serves would crumble. Availability and accuracy of information are absolute. This app can't go down, and it can't be wrong.
Why Does This Work?
The client said that this works because the developers know that they have 100% responsibility for the application. If it doesn't work, the developers can't say that "QA didn't catch the problem." There is no QA team to blame. The buck stops with the application development team. They better get it right, or heads will roll.
As British author Samuel Johnson famously put it, "The prospect of being hanged focuses the mind wonderfully."
Recently, I’ve been getting more inquiries about risk-based testing. Along with agile test methods and test estimation, test teams turning their eyes to risk-based testing is another positive step toward integrating quality throughout the SDLC. Yes, I still see QA engineers having to put on their evangelist hats to educate their developer brothers and sisters that quality is more than just testing (don’t get me wrong: consistent unit and integration testing is a beautiful thing). However, any time business and technology partners think about impact and dependencies in their approach to a solid, workable application, they elevate quality to the next level.
Keep asking those questions about risk-based testing, and make sure that you’re covering all of the angles: