Wednesday, October 10, 2007

PRACTICAL EXPERIENCE WITH AUTOMATION TESTING.

My perspective on most things is that the 'glass is half full' rather than half empty. That attitude carries over to the advice I offer on automated software testing as well. I should point out, however, that there is growing recognition among others experienced in this field, echoed by my own experience, that many test automation efforts do not live up to expectations. A lot of effort goes into developing and maintaining test automation, and even once it's built you may or may not recoup your investment. It's very important to perform a good cost/benefit analysis on whatever manual testing you plan to automate. The successes I've seen have mostly come from automating focused areas of the application where it made sense to do so, rather than from complete automation efforts. Also, skilled people were involved in those efforts and they were allowed the time to do it right.
Test automation can add a lot of complexity and cost to a test team's effort, but it can also provide valuable assistance if it's done by the right people, in the right environment, and where it makes sense to do so. I hope that by sharing some pointers I feel are important, you'll find value that translates into saved time, saved money and less frustration when you implement test automation back on the job.
Key Points

I’ve listed the ‘key points’ up front instead of waiting until the end. The rest of the article will add detail to some of these key points.
* First, it's important to define the purpose of taking on a test automation effort. There are several categories of testing tools, each with its own purpose. Identifying what you want to automate, and where in the testing life cycle, is the first step in developing a test automation strategy. Just wishing that everything should be tested faster is not a practical strategy. You need to be specific.
* Developing a test automation strategy is very important in mapping out what's to be automated, how it's going to be done, how the scripts will be maintained and what the expected costs and benefits will be. Just as every testing effort should have a testing strategy, or test plan, so should there be a 'plan' built for test automation.
* Many of the testing 'tools' provided by vendors are very sophisticated and use existing or proprietary coding 'languages'. The effort of automating an existing manual testing effort is no different from a programmer using a coding language to write programs that automate any other manual process. Treat the entire process of automating testing as you would any other software development effort. This includes defining what should be automated (the requirements phase), designing the test automation, writing the scripts, testing the scripts, etc. The scripts need to be maintained over the life of the product just as any program would require maintenance. Other components of software development, such as configuration management, also apply.
* The effort of test automation is an investment. More time and resources are needed up front in order to obtain the benefits later on. Sure, some scripts can be created that provide an immediate payoff, but those opportunities are usually small in number relative to the effort of automating most test cases. What this implies is that there usually is not a positive payoff for automating the current release of the application. The benefit comes from running these automated tests in every subsequent release. Therefore, ensuring that the scripts can be easily maintained becomes very important (see the sketch after this list).
* Since test automation really is another software development effort, it's important that those performing the work have the correct skill sets. A good tester does not necessarily make a good test automator. In fact, the job requirements are quite different. Good testers are still necessary to identify and write test cases for what needs to be tested. A test automator, on the other hand, takes these test cases and writes code to automate the process of executing those tests. From what I've seen, the best test automation efforts have been led by developers who have put their energies into test automation. That's not to say that testers can't learn to be test automators and be successful; it's just that those two roles are different and the skill sets are different.
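To put the maintainability point in concrete terms, here is a minimal, self-contained sketch of one common approach: keep all knowledge of how to drive the application in one place, so that when a screen changes only that one class needs updating, not every script that touches it. Everything here is made up for illustration (the FakeDriver, LoginPage and field names are hypothetical, not any particular vendor's tool).

```python
# A minimal sketch of keeping automated scripts maintainable by separating
# the details of driving the application from the checks the tests make.
# All names here are hypothetical stand-ins.
import unittest

class FakeDriver:
    """Stand-in for a real UI driver; records what was typed and clicked."""
    def __init__(self):
        self.fields = {}
        self.clicked = None

    def type(self, field, value):
        self.fields[field] = value

    def click(self, button):
        self.clicked = button

class LoginPage:
    """All knowledge of the login screen lives here, and only here."""
    def __init__(self, driver):
        self.driver = driver

    def login(self, username, password):
        self.driver.type("username", username)
        self.driver.type("password", password)
        self.driver.click("login_button")

class LoginScriptTest(unittest.TestCase):
    def test_login_fills_fields_and_submits(self):
        driver = FakeDriver()
        LoginPage(driver).login("qa_user", "secret")
        self.assertEqual(driver.fields["username"], "qa_user")
        self.assertEqual(driver.clicked, "login_button")

if __name__ == "__main__":
    unittest.main()
```

Whatever tool you actually use, the point is the separation, not the particular names: the test reads like the manual test case it automates, and UI churn is absorbed in one maintainable place.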
Other Points

Here are some other important points to consider:

When strategizing for test automation, plan to achieve small successes and grow. It's better to incur a small investment and see what the effort really takes before going gung ho and trying to automate the whole regression suite. This also gives those doing the work the opportunity to try things, make mistakes and design even better approaches.

Many software development efforts are underestimated, sometimes grossly underestimated. This applies to test automation as well, especially if the effort is not looked upon as software development. Test automation is not something that can be done on the side, and care should be taken when estimating the amount of effort involved. Again, by starting small and growing, you can get a much better gauge of how much work is really involved.

When people think of testing tools, many first think of the 'capture/playback' variety, where the application is tested at the end during system test. There are, however, several types of testing tools that can be applied at various points of code integration. Test automation can be applied at each of the levels of testing, including unit testing, one or more layers of integration testing, and system testing (another form of integration). The sooner tests can be executed after the code is written, before too much code integration has occurred, the less likely it is that bugs will be carried forward. When strategizing for test automation, consider automating these early tests as well as the later ones in the testing life cycle.
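As a simple illustration of automating at the earliest level, here is a sketch using only Python's standard unittest module. The calculate_shipping function is a made-up example of code that was just written and can be tested immediately, before any integration happens.

```python
# A minimal sketch of a unit-level automated test, runnable the moment
# the code under test is written. calculate_shipping is hypothetical.
import unittest

def calculate_shipping(weight_kg, express=False):
    """The code under test: a flat base rate plus a per-kilogram charge."""
    base = 10.0 if express else 5.0
    return base + 1.5 * weight_kg

class ShippingTests(unittest.TestCase):
    def test_standard_rate(self):
        # 5.00 base + 1.50 * 2 kg = 8.00
        self.assertAlmostEqual(calculate_shipping(2), 8.0)

    def test_express_rate(self):
        # 10.00 base + 1.50 * 2 kg = 13.00
        self.assertAlmostEqual(calculate_shipping(2, express=True), 13.0)

if __name__ == "__main__":
    unittest.main()
```

Tests like these run in seconds, catch the bug before the code is integrated with anything else, and become part of the regression suite for every subsequent release.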

Related to this last point is the idea that testers and software developers need to work as a team to make test automation effective. I don't believe testing independence is lost when testers and developers work together, and there can be some excellent advantages, which I'll point out later.

Testing tools, as sophisticated as they have become, are still dependent upon consistency in the test environment. This should be quite obvious, but having a dedicated test environment is absolutely necessary. If testers don't have control of their test environment and test data, the environment may simply not be in the state their tests require. When manual testing is done, testers can sometimes 'work around' test setup issues. Automated test scripts are less flexible and require specific setup scenarios, so they need more control.
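To show what 'controlling the setup' can look like inside a script, here is a minimal sketch in which the test builds its own known data rather than depending on whatever happens to be in a shared environment. The in-memory SQLite database and the orders table are purely illustrative choices.

```python
# A minimal sketch of an automated test that creates the exact setup it
# needs and cleans up after itself. Table and data are hypothetical.
import sqlite3
import unittest

class OrderReportTest(unittest.TestCase):
    def setUp(self):
        # Build a private, known-state database for this test run.
        self.db = sqlite3.connect(":memory:")
        self.db.execute("CREATE TABLE orders (id INTEGER, status TEXT)")
        self.db.executemany(
            "INSERT INTO orders VALUES (?, ?)",
            [(1, "open"), (2, "shipped"), (3, "open")],
        )

    def tearDown(self):
        # Leave nothing behind for the next script to trip over.
        self.db.close()

    def test_open_order_count(self):
        count = self.db.execute(
            "SELECT COUNT(*) FROM orders WHERE status = 'open'"
        ).fetchone()[0]
        self.assertEqual(count, 2)

if __name__ == "__main__":
    unittest.main()
```

A manual tester might shrug off missing data and improvise; a script like this one fails outright unless the setup it expects is there, which is exactly why a dedicated, controlled environment matters so much more once automation is in play.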

Test automation is not the only answer to delivering quality software. In fact, test automation is in many cases a last-gasp effort to find problems after they've been made instead of eliminating the problems as they are being created. Test automation is not a substitute for walkthroughs, inspections, good project management, coding standards, good configuration management, etc. Most of these efforts produce a higher payback on the investment than test automation does. Testing will always need to be done and test automation can assist, but it should not be looked upon as the primary activity in producing better software.

The truth is that developers can produce code faster than ever, and with more complexity than ever before. Advancements in code generation tools and code reuse are making it difficult for testers to keep up with software development. Test automation, especially if applied only at the end of the testing cycle, will not be able to keep up with these advances. We must pull out all the stops along the development life cycle to build quality into the software and to test as early and as often as possible, with the assistance of test automation.
