Thursday, July 3, 2008

Testing in the open source world. Part 1

Intro
Today I would like to begin a multi-part series about quality assurance testing and how it can work with open source software and projects. There are several stages to be aware of, and many techniques for achieving a high level of quality and code coverage. Although I am familiar with many of these, in an effort to keep things as simple as possible, I will only talk about the most popular aspects of each stage of testing. Enjoy.

How To Begin - a.k.a. Requirements
Software comes in many shapes and sizes, each with its own varying amount of documentation. I have seen notepad applications with every feature carefully explained, and huge supply chain strategy suites with no documentation at all. Our hope, as testers, is to always land somewhere between these two extremes. Some documentation is definitely preferred, but even when it is lacking, with some work we can get what we need to begin creating a structured test plan.

So, let's assume for now that we have some documentation. In the open source world this documentation usually takes the form of a FAQ or release notes from the developer. For our purposes here, let's say you are interested in contributing some testing to Vuze (Azureus). It was the number one download on SourceForge the day this article was written, and it has a rather large FAQ explaining basic functionality, found here.

Planning
We now have a requirements document (the FAQ) and can begin designing our test objectives. Depending on the scope of what we want to accomplish, we can plan to test all of the documented functionality, or focus on a single area. For our example, Vuze, let's pick a small subset of features to test. We will focus on the Content Search features described here.

The first document we will create is a Test Plan. This is a distillation of the project/application requirements. We want to create a document that we can easily reference to ensure that we have not missed a requirement during testing. There are several formats for this, and no one format is better or worse than another, so pick one you are comfortable with. I prefer a collaborative testing effort and use wiki documents for it, so my examples will be based on that technology.
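Before we fill anything in, here is a rough sketch of how such a wiki page might be laid out. The numbering scheme and section titles below are just placeholders to show the shape of the document; adapt them to your own wiki and the features you choose:

Vuze Content Search - Test Plan

1.0 - Discovering Content
  1.1 - (requirement pulled from the FAQ)
  1.2 - (requirement pulled from the FAQ)
  Source: (link to the FAQ section used)

2.0 - Search Box
  (and so on for each feature area)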

So let's pull some requirements out. Go here and copy the content into your requirements document. Now format it into numbered items under a heading like "1.0 - Discovering Content". You can see my final example requirement here.
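To give you a feel for it, here is what that first section might look like once it is filled in. The wording below is my own paraphrase for illustration, not text lifted from the FAQ, so check the real document for the exact requirements:

1.0 - Discovering Content
  1.1 - The user can browse the available content from within the Vuze client.
  1.2 - The user can filter the displayed content by category.
  1.3 - Selecting an item displays its details and allows the download to be started.
  Source: Vuze FAQ - Discovering Content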

It is also very important to reference the source material you used for each section of your plan. This lets the reader move between the documents easily, and lets you quickly track changes in requirements and make the necessary updates to all related documents.

Now just repeat these steps with as many sections of the FAQ as you would like to test. Be sure to keep incrementing the section numbers so you do not repeat a header. So "What's new about the search box" would become "2.0 - Search Box", and so on.
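At the top level, the plan then grows into something like this. Only the first two titles come from our example; the rest are hypothetical stand-ins for whichever FAQ sections you chose:

1.0 - Discovering Content
2.0 - Search Box
3.0 - (next FAQ section)
4.0 - (and so on)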

Once we have created a large set of indexed requirements, we can begin testing each of the functions the developer has documented. Without a good set of requirements the test effort can become sidetracked or unfocused. Having a clear path for the progression of the test effort will keep the testing manageable, and some even think...fun.

Closing
Next time we will move forward to the next step: creating test cases from your requirements. Until then, take a look at the following links for additional examples of test requirements. Happy hunting.

Chris

Additional Links


Requirements for UME/Intel

Template for Requirements, as found in the proprietary software world (Glad we are not there! :-P )