Examine test motivators and test items
Purpose:
|
To consider the influence of the mission, test motivators and the test items on the approach
for the forthcoming test effort.
|
Using the evaluation mission as context, examine the iteration Test Plan and study the test motivators that
have been identified for the forthcoming test effort. It may be necessary to investigate further at the
source of each motivator; the iteration plan usually provides a means of locating additional information.
For each motivator, consider what test approach and associated techniques might be required to address it.
Also examine the iteration Test Plan and study the test items. Consider each targeted test item in relation
to each motivator, and extend the approach and techniques accordingly. If you cannot find much detail about,
or are unfamiliar with, the test items, it may be useful to discuss the targeted items with the development
staff, usually starting with the software architect or the development team leads.
Focus on identifying the minimal set of techniques necessary to satisfactorily address the evaluation
mission and motivators. Look for opportunities where one technique can be used to address more than one
aspect of the required testing. Note other potential techniques that seem interesting to explore, but be
able to identify these as additional rather than essential.
|
Examine the software architecture
Purpose:
|
To consider the influence of the software architecture on the test approach.
|
Study the Software Architecture to gain an understanding of its key elements: mechanisms, main views, and so
forth. Typically the Software Architecture Document provides good information focused at the right level of
detail for use in considering a test approach. To clarify its information, or in the absence of such a
document, it is useful to discuss the architecture with the development staff, usually by talking directly
to the software architect or to one of the development team leads.
Focus on identifying and discussing the key mechanisms, and gaining a good understanding of these aspects
of the system. Each mechanism and key feature of the architecture will likely present challenges or
constraints for the test effort. For example, a distributed architecture may necessitate organizing the
test team into sub-teams, each sub-team targeting an architectural tier.
While a creative test implementation and execution strategy can often
overcome these challenges, it may be necessary to have the
development team modify the software to enable testing, as discussed in Task: Define Testability Elements.
|
Consider the appropriate breadth and depth of the test approach
Purpose:
|
To consider the completeness of the test approach both in terms of breadth and depth.
|
Considering all the details that are now known about the requirements on the
test approach, it is beneficial to step back and consider the test approach
from a higher-level perspective. What should the test approach address
that it currently does not? Are there concerns worth exploring that do not appear
in any of the documented information?
Based on your experience, review the requirements for the test approach for appropriate breadth and depth
for this stage in the project lifecycle. Consider additional requirements that will help to present a more
complete approach.
|
Identify existing test techniques for reuse
Purpose:
|
To reuse or adapt existing, proven test techniques where appropriate.
|
From your own experience, or other experience you have access to, identify existing techniques that will
either meet the requirements of the test approach, or can be adapted to meet them.
|
Identify additional techniques
Purpose:
|
To identify the techniques required to provide a comprehensive and sufficient test
approach.
|
It is not terribly useful to think in terms of a "complete" test approach: there are always additional
techniques you might try if you had limitless time and resources.
However, it is important that the test approach is well-rounded and comprehensive enough to allow a useful
evaluation of perceived quality to be made. This requires an approach that evaluates sufficient aspects of
quality risk or dimensions of quality for the project team to assess perceived quality with a justified
degree of confidence.
|
Define techniques
Purpose:
|
To outline the workings of each technique, including the objective of the testing it
supports.
|
Outline the workings of each technique. Address the type of testing it supports, the objective and scope,
implementation method, test oracles, assessment method and automation needs of the technique.
In many cases you will reuse techniques from one project to the next. In that situation you can simply
reference a common definition of the technique, or copy the existing definition and revise it as appropriate.
For each existing or required technique:
Many techniques will support more than one type of testing, so give some thought to identifying which tests
the technique will need to support. This helps to identify the scope of the effort required if the
technique is being defined for the first time.
Give thought to the underlying objective and value this technique represents.
Define how the technique will be implemented. It is not enough to simply state "We're doing system
performance testing"; you need to give serious thought to how that can be achieved.
Some techniques you would like to use will be uneconomic to pursue. By briefly describing how you will
approach implementing the technique, you will get an overall sense of the logistics involved and
the practicality of pursuing it further.
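For example, here is a minimal sketch of how a response-time check within a system performance testing
technique might begin to be implemented; the service URL, sample size, and two-second threshold are purely
illustrative assumptions, not part of this guidance:

    # Illustrative sketch only: the URL, threshold, and sample size are assumptions.
    import time
    import urllib.request

    SERVICE_URL = "http://localhost:8080/orders"  # hypothetical system under test
    MAX_RESPONSE_SECONDS = 2.0                    # assumed requirement for illustration
    SAMPLE_SIZE = 25

    def measure_response_time(url):
        """Issue one request and return the elapsed time in seconds."""
        start = time.perf_counter()
        with urllib.request.urlopen(url) as response:
            response.read()
        return time.perf_counter() - start

    timings = [measure_response_time(SERVICE_URL) for _ in range(SAMPLE_SIZE)]
    print(f"average={sum(timings) / len(timings):.3f}s worst={max(timings):.3f}s")
    assert max(timings) <= MAX_RESPONSE_SECONDS, "response time exceeds the assumed limit"

Even a sketch at this level exposes the logistical questions that matter: where the test environment
comes from, how load will be generated, and how results will be recorded.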
Determine how you will observe and evaluate the results of each test implemented using this technique.
Give thought to the different Test Oracles available for you to use: is there a single
oracle, or are there different ways to determine the result of each test?
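As a small illustration, the same test result can often be checked by more than one oracle. The sketch
below uses a hypothetical function under test and contrasts an expected-value oracle with a property-based
oracle:

    # Illustrative sketch: the function under test and its data are hypothetical.
    def sort_invoice_lines(lines):  # stand-in for the behaviour being tested
        return sorted(lines)

    result = sort_invoice_lines([3, 1, 2])

    # Oracle 1: expected-value comparison against a precomputed "golden" answer.
    assert result == [1, 2, 3]

    # Oracle 2: property check -- no golden answer needed, only invariants that
    # must hold for any valid output (ordering and preserved contents).
    assert all(a <= b for a, b in zip(result, result[1:]))
    assert sorted(result) == sorted([3, 1, 2])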
Automation can play an important role in many test techniques. In some cases the automation will be
relatively simple, doing little more than supporting the conduct of manual tests.
Give some thought to how the work involving the technique could be most efficiently implemented, maintained,
and managed. Be open-minded: think both broad and deep, considering as many options as possible.
Identify the appropriate tools to use with this test technique. Use the work from the previous step that
identified uses of automation.
Remember to consider a broad range of tool categories; your list of candidate tools should include more
than just test execution automation tools. In addition to tools that automate test execution, consider tools
that will enhance the productivity of the test team by reducing repetitive, laborious tasks: Test
Data management, Test Results analysis, and incident and Change Request reporting tools, among others.
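As one small illustration of this kind of productivity support, a simple script can generate test data that
would otherwise be prepared by hand; the file name, field names, and value ranges below are assumptions made
for the example:

    # Illustrative sketch: generates synthetic customer records as CSV test data.
    import csv
    import random

    FIRST_NAMES = ["Ana", "Bo", "Chen", "Dara"]

    with open("test_customers.csv", "w", newline="") as handle:
        writer = csv.writer(handle)
        writer.writerow(["customer_id", "name", "credit_limit"])
        for customer_id in range(1, 101):
            writer.writerow([customer_id,
                             random.choice(FIRST_NAMES),
                             random.randint(100, 5000)])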
|
Outline the Test Automation Architecture
Purpose:
|
To define a candidate architecture for the test automation system.
|
Based on experience gained from similar systems or in similar problem domains, begin to define a candidate
architecture for the test automation system.
We recommend you review the information at the following link to help you with this task: Activity: Define a Candidate Architecture.
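As a starting point only, the sketch below illustrates one common layering for a test automation system: a
driver layer that hides tool- and interface-specific detail, with test cases expressed on top of it in
business terms. All class and method names here are hypothetical; the real candidate architecture should be
shaped by your own tools and system interfaces.

    # Illustrative layering only; names are hypothetical.
    class OrderSystemDriver:
        """Driver layer: hides interface details (GUI, API, messaging)
        behind intention-revealing operations that tests can call."""
        def place_order(self, item, quantity):
            # Tool-specific calls (GUI automation, HTTP, etc.) would go here.
            return {"item": item, "quantity": quantity, "status": "ACCEPTED"}

    class OrderTests:
        """Test layer: expresses checks in business terms, independent of
        how the driver reaches the system under test."""
        def __init__(self, driver):
            self.driver = driver

        def test_order_is_accepted(self):
            result = self.driver.place_order("widget", 3)
            assert result["status"] == "ACCEPTED"

    OrderTests(OrderSystemDriver()).test_order_is_accepted()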
|
Define the test asset configuration management strategy
Purpose:
|
To consider what requirements the test effort will have for configuration management.
|
Like many other work products produced during a software development project, test assets are candidates for
configuration management and version control.
The specific requirements can range in complexity from simply enabling basic backup and recovery
services, to full support for parallel development of automated Test Scripts at multiple
sites against different versions of an application.
Give thought to your requirements for configuration management, and begin to outline the probable
logistical needs of realizing those requirements.
|
Survey availability of reusable assets
Purpose:
|
To reduce risk and effort by reusing existing proven assets.
|
Sometimes it makes sense to build assets from scratch, and sometimes it does not. Try to find a good balance
between a complete "roll-your-own" philosophy and a rigid, bureaucratic librarian policy on
new work product creation.
There are times when one approach is better than the other, and you should be flexible enough to take
advantage of the benefits that both approaches bring.
|
Capture your findings
Purpose:
|
To record the important information about the test approach.
|
Depending on a number of factors, including team size and organizational culture, there will be better and
worse ways to record the decisions you have made about the test approach.
You will typically have two audiences to consider: the management team will want to review this information
to provide approval and to understand the logistical implications of the approach, and the test team will
want to use the test approach as guidance for the work they undertake. Try to find a medium that suitably
addresses both needs: perhaps a project intranet website.
|
Evaluate and verify your results
Purpose:
|
To verify that the task has been completed appropriately and that the resulting work products
are acceptable.
|
Now that you have completed the work, it is beneficial to verify that the work was of sufficient value, and
that you did not simply consume vast quantities of paper. You should evaluate whether your work is of
appropriate quality and whether it is complete enough to be useful to those team members who will make
subsequent use of it as input to their work. Where possible, use the checklists provided in RUP to verify
that quality and completeness are "good enough".
Have the people performing the downstream tasks that rely on your work as input take part in reviewing
your interim work. Do this while you still have time available to take action to address their concerns.
You should also evaluate your work against the key input work products to make sure you have represented them
accurately and sufficiently. It may be useful to have the author of the input work product review your work on
this basis.
Try to remember that RUP is an iterative delivery process and that, in many cases, work products evolve over
time. As such, it is not usually necessary, and is often counterproductive, to fully form a work product that
will only be partially used, or not used at all, in immediately subsequent work. There is a high probability
that the situation surrounding the work product will change, and that the assumptions made when it was
created will prove incorrect, before the work product is used, resulting in wasted effort and costly rework.
Also avoid the trap of spending too many cycles on presentation to the detriment of content value.
In project environments where presentation has importance and economic value as a project deliverable, you
might want to consider using an administrative resource to perform presentation tasks.
|
|