After working with ThoughtWorks I’m completely sold on Agile software development for the type of product we deliver, which, if you know me, may be a bit of a surprise, as I have a reputation for being a cynical old curmudgeon, and a staunch doubter to boot.
While I am still unsure whether agile would work for large-scale software projects (think air traffic control), I am certain that there are agile practices (lean, for example) that could be pilfered and applied to those lumbering projects too. If you are reading this and you haven’t had any exposure to Agile, I guess the biggest thing I should tell you is that this isn’t some process you just pick up and run with, nor is it a methodology to apply. You can’t learn it from a book or from a project template; it is quite simply a mindset, and as such it presents the practitioner with a real paradigm shift. Simply put, you can’t flick a big agile switch and hey presto you are Agile; it is more subtle than that.
I think before I continue it’s important that I clarify something. The term agile is applied to a wide variety of processes, techniques, methods, tools, practices, projects, and phases of the development life cycle; it has become a buzzword used by people trying to paint their work in a new light (who doesn’t want to be known as agile?). It’s important, therefore, to set out some basic definitions and context for the use of the term "agile," especially as I will use it constantly throughout this article.
Within the context of software development, the term "agile" (with a small "a") implies that the development team is nimble, flexible and responsive to business needs, and that it is able to adopt new technologies and techniques that can improve software delivery. The term "Agile" (with a capital "A") refers to a very specific set of processes (and I use the term process as more of a placeholder) applied to software development that have evolved over the past fifteen years or so, including some you have probably heard of, like eXtreme Programming (XP), Scrum, Feature-Driven Development (FDD), Crystal, Dynamic Systems Development Method (DSDM) and Lean Software Development. A non-profit organisation, the Agile Alliance, was created by the people behind most of the Agile processes. The Agile Alliance promotes a set of core values that a process must follow to be called Agile:
- Individuals and interactions over processes and tools
- Working software over comprehensive documentation
- Customer collaboration over contract negotiation
- Responding to change over following a plan
So to be Agile, then, a process must support these values (and more), albeit in diverse ways. Some processes, Scrum for example, address team management, while others, such as XP or DSDM, address development activities or other activities of the software development life cycle. It’s worth making a mental note that users of an Agile process do not have to follow all of its practices, and neither does the use of one process preclude the use of any other. One thing I learnt is that Agile supports this kind of process change: if a particular way of working is not working, then change it. In fact, I found that many of the Agile practices are complementary.
No matter your preference, all of the different flavours of Agile deliver working functionality in short, time-boxed iterations. They implement early and frequent testing. They require lots of involvement from the customer on a frequent if not full-time basis, and they assume that the customer’s requirements will continually change.
So why all the fuss about big "A" little "a" when talking about Agile? There are two main reasons:
- Companies that adopt Agile processes have to be prepared to completely change not only the way they develop software but the way they think, and this change is so big it is very hard to do. However, companies that have achieved this shift will appreciate the significant benefits they can reap in the productivity, quality and value of the software that they deliver. Notice I didn't say it would be faster. These companies are Agile with a capital A.
- Companies that are not able to embark on this level of change can still become more reactive and flexible in the ways that they build software. They become more agile and begin to realise the advantages that Agile can deliver. This toe-dipping exercise can lead to a true Agile team; however, the company must understand that it is not the preferable route, and that it would benefit more from the gung-ho approach to Agile adoption.
To reiterate what I've said: at a minimum, every Agile process delivers working functionality in short, time-boxed iterations. Agile implements early and frequent testing, it involves the customer on a frequent if not full-time basis, and it assumes that requirements cannot be fully defined at the start of a project and will continually change. The ethos is simple: by using these practices, the development team will be able to respond quickly to ever-changing customer priorities and feedback, and deliver value to the business. This is often misunderstood as being quicker; I would prefer to describe it as improved time to benefit.
As a tester I have become quite accustomed to being involved late in a project, often right before delivery. I have adapted and learnt how to cope with shortened time frames for testing, and with receiving specifications and requirements documentation that don’t actually match the product delivered into QA. Typically the business (or for that matter the project team) has little interest in the test team’s input, that is until they call us a bottleneck.
Feeling loved and appreciated.
Agile software development for a tester is radically different from the traditional PRINCE2/waterfall software development life cycle (SDLC), because it throws QA right into the heart of the project on day one. As Agile testers we suddenly found ourselves involved in the analysis and design of the product. We became heavily involved in decision-making throughout the whole project, and because the delivery of the software is incremental we found that we began testing at the very beginning of the project and had to keep pace with development to prevent any delay. This is a far cry from waiting for a software deliverable (possibly unfit for purpose) to be thrown over the fence a few weeks before go-live.
The paradigm shift I have mentioned is a large one, not just for QA but the whole project team, because our QA team now drives the entire software development process.
Quality assurance, with its focus on preventing defects, is translated into the agile practice of having committed QA resources on the development team that participate in decision-making on a daily basis, throughout the life cycle of the project.
Their input during elaboration and design helps developers write better code. More “what-if” scenarios are considered and planned for, as the collaboration between coders and testers gives the coders more insight than if they had planned the work on their own.
Likewise, the testers gain added insight into the expected functionality from the coders and the product owner, and are able to write more effective test cases for the product.
Quality control places its emphasis on finding defects that have already slipped into the system, and on working with developers to eliminate those defects. This bug-checking is done within the iteration, using techniques such as daily builds and smoke tests, automated regression testing, unit testing, functional and exploratory testing, and acceptance testing. Everyone participates – no one is exempt from the task of ensuring that the features coded meet the customer’s expectations.
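To give a flavour of the smoke-test end of that list, here is a minimal sketch in Java and JUnit of the kind of check that might run against every daily build before anything else does; the URL and path are placeholders I have invented, not details of our actual product.

```java
import static org.junit.Assert.assertEquals;

import java.net.HttpURLConnection;
import java.net.URL;

import org.junit.Test;

// A minimal smoke test: does the freshly deployed daily build respond at all?
// The base URL below is a made-up placeholder, not our real environment.
public class SmokeTest {

    private static final String BASE_URL = "http://localhost:8080/myapp";

    @Test
    public void applicationRespondsAfterDeployment() throws Exception {
        HttpURLConnection connection =
                (HttpURLConnection) new URL(BASE_URL + "/login").openConnection();
        connection.setRequestMethod("GET");

        // If this fails there is little point running the rest of the suite.
        assertEquals(200, connection.getResponseCode());
    }
}
```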
Your role morphs and evolves.
Through a methodology known as Story Test Driven Development, the test requirements (aka the acceptance tests) are captured in a test-like format (we currently use Twist) and then augmented to turn them into automated tests. There is nothing unusual in that; lots of test teams create automated tests. Here, however, the automated tests are executed by the development team, not the test team, and the tests exist before the development team has even written the code for the software being delivered. The real beauty of this method is that the development team can then integrate the automated tests into a Continuous Integration environment, where newly checked-in code is built and tested automatically. So the QA team’s test suite is run against the code every time a change gets checked in. But wait, that’s not all. The delivery of code into QA cannot happen until it has passed all of our tests, which means that we have built the quality in from the very start. Let me state that again, in case you missed it: the development team cannot deliver code into QA until it has passed the QA team’s tests. That’s a statement that should raise a lot of internal debate with any passionate tester who hasn’t worked in this way before, and it did with us. It’s worth making the distinction here that Test Driven Development (TDD) is not a testing methodology, it’s a design methodology. Being testers, we have a very different view of the world to a developer; it’s in our very nature. Working this way felt like we had harnessed our power and put it where it belongs, under the smelting pot of code.
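To make that a little more concrete, here is a rough sketch, in plain JUnit rather than Twist’s own format, of what an executable story test might look like. The domain, class and method names are entirely invented for illustration; the point is that the test is checked in before the production code exists, and the CI build will not promote the code into QA until it passes.

```java
import static org.junit.Assert.assertEquals;

import org.junit.Test;

// A story test the QA team writes up front (hypothetical domain, not our
// product and not Twist syntax). It is checked in before the production
// code exists; the stub at the bottom stands in for the implementation the
// developers must supply before the story can pass and be promoted into QA.
public class ReturningCustomerDiscountStoryTest {

    @Test
    public void returningCustomerReceivesTenPercentDiscount() {
        Checkout checkout = new Checkout();
        assertEquals(90.00, checkout.priceFor(100.00, true), 0.001);
    }

    @Test
    public void newCustomerPaysFullPrice() {
        Checkout checkout = new Checkout();
        assertEquals(100.00, checkout.priceFor(100.00, false), 0.001);
    }

    // Placeholder for the class the development team will eventually deliver.
    static class Checkout {
        double priceFor(double listPrice, boolean returningCustomer) {
            return returningCustomer ? listPrice * 0.9 : listPrice;
        }
    }
}
```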
I should also mention here that the developers were also working in a new way, because they worked in pairs (pair programming) and also used TDD at unit level. This means that a developer has to write a unit test before he writes the code for the unit being developed. That means, then, that before it’s delivered to QA, our code has been tested twice. That’s two times tested.
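As an illustration of that unit-level rhythm (the names below are invented, not lifted from our codebase): the test is written first and fails, and only then is just enough production code written to make it pass.

```java
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.fail;

import org.junit.Test;

// Unit-level TDD as the pairs practised it: the tests exist before the
// production code does. The formatter below was written only to satisfy them.
public class OrderNumberFormatterTest {

    @Test
    public void padsOrderNumbersToEightDigits() {
        assertEquals("ORD-00000042", OrderNumberFormatter.format(42));
    }

    @Test
    public void rejectsNegativeOrderNumbers() {
        try {
            OrderNumberFormatter.format(-1);
            fail("Expected an IllegalArgumentException");
        } catch (IllegalArgumentException expected) {
            // negative numbers are not valid order numbers
        }
    }

    // The production code, written after (and because of) the tests above.
    static class OrderNumberFormatter {
        static String format(int number) {
            if (number < 0) {
                throw new IllegalArgumentException("Order numbers must be positive");
            }
            return String.format("ORD-%08d", number);
        }
    }
}
```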
So if development is running the QA team’s tests, and the QA team is writing the automated tests (and it’s a blurry line between coding and writing an automated test), and the code is tested before it’s delivered to QA, a whole load of questions begin to bubble up.
Some of the early questions we had:
- Who tests the testers’ tests?
- This is agile, so what happens if the requirements change?
- So what do QA test if it’s already passed the QA tests?
On a typical PRINCE2/waterfall project the testing team plans as far in advance as it can. We normally try to follow the IEEE 829-1983 standard and document as much as we can.
The documentation covers how we approach the testing and details the testing activities. These documents are usually created in isolation from the project team and may be published for approval. However, in my experience the documents are rarely scrutinised, and any feedback given only serves to pay lip service to the process. Another checkbox checked, and another Gantt chart milestone met.
Working in an agile testing environment still requires us to define which tools and methods will be used for writing, executing, and reporting tests, and to determine the best approach to testing and the scope of that testing. The big difference is that the whole team is engaged in this process of definition, and we found that it was important to engage the developers, because they would be executing our tests and writing their own unit tests. Moreover, thought had to be given to automating the regression testing, something that would happen as part of the continuous integration process (there is a sketch of this a little further down).
The business stakeholders were also involved in this process (unfortunately only by proxy through the business analysts, as they are physically located in a different part of the country) as they would help to define and run the acceptance tests. In agile, we (the whole team) all test, but the business accepts.
In short, within agile practices everyone has a part to play in defining, upholding, and improving the quality of the product.
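Coming back to the regression automation I mentioned a moment ago: as one sketch of how it can be handed to the continuous integration build (our actual setup differed in the details), a JUnit suite can group the automated checks so the CI server runs them on every check-in. The member classes here are the hypothetical tests sketched earlier in this article, not our real suite.

```java
import org.junit.runner.RunWith;
import org.junit.runners.Suite;

// A regression pack the CI server can run on every check-in. The member
// classes are the illustrative tests sketched earlier, not our real suite.
@RunWith(Suite.class)
@Suite.SuiteClasses({
        SmokeTest.class,
        ReturningCustomerDiscountStoryTest.class,
        OrderNumberFormatterTest.class
})
public class RegressionSuite {
    // Intentionally empty: the annotations tell JUnit what to run.
}
```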
One of the gotchas was that we found we needed to become more technical. We had thought that we were already more technically skilled than your average tester; however, as we endeavoured to automate our testing we found that we had to skill up and learn not just how to write code (Java for us), but how to compile it and put it under version control. They were steep, steep learning curves which, now that we have overcome them, have empowered us, giving us the tools for a brighter, faster and more accurate future. Having said that, it does appear that some of this coding effort could be considered a one-off setup task, because we now have a framework of “tools” that covers all of the tasks we need to help us execute an automated test. Couldn’t we have asked the development team to do the tech tasks for us? We did, and they didn’t have any resource free to accommodate our needs as well as those of the agile project. Regardless, we now have skills that we have been able to share with the wider QA team, and they too are seeing the fruits of our initial labour.
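To give a flavour of the kind of reusable “tool” that framework ended up containing (this is an illustrative sketch, not our actual code): a small helper that retries a check until it passes or a timeout expires, so individual tests don’t have to sprinkle ad-hoc sleeps everywhere.

```java
import java.util.concurrent.Callable;

// A reusable helper of the sort a tester-built framework tends to grow:
// poll a condition until it is true or a timeout expires. Illustrative only.
public final class WaitFor {

    private WaitFor() {
        // utility class, no instances
    }

    public static void condition(Callable<Boolean> check,
                                 long timeoutMillis,
                                 long pollIntervalMillis) throws Exception {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (System.currentTimeMillis() < deadline) {
            if (Boolean.TRUE.equals(check.call())) {
                return; // the condition was met in time
            }
            Thread.sleep(pollIntervalMillis);
        }
        throw new AssertionError("Condition not met within " + timeoutMillis + "ms");
    }
}
```

A test wraps whatever it is waiting for in a Callable and calls WaitFor.condition instead of hard-coding sleeps, which keeps the flakiness (and the fix for it) in one place.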
Traditional Tools Solve Traditional Problems in Traditional Contexts. Agile Is Not Traditional.
Traditional, heavyweight, record-and-playback tools (like Quality Center) were built for teams operating in a traditional context with specialisms. They address the challenge of having non-programmers automate tests by offering record-and-playback features, a simplified editing environment, and a simplified programming language.
But Agile teams don’t need tools like these, optimised for non-programmers. What Agile test teams need are tools that solve an entirely different set of challenges, related to collaborating, communicating, reducing waste (muda), and shortening the feedback loop. Ergo, traditional (long-standing) test automation tools just don’t cut the mustard in an Agile context, because they are designed to solve traditional problems in traditional contexts, and those really are quite different from the challenges faced by Agile test teams. To make it clear, QC & TD aren’t going to cut it in Agile.
At the Google Tech Talks on December 9, 2005, Elisabeth Hendrickson gave a talk on how, as more teams adopt Agile practices such as XP and Scrum, software testing teams are being asked to become "Agile" as well... View it here