1. Database testing can get complex. It may be worth your while to create a separate test plan specifically for database testing.
2. Look for database-related requirements in your requirements documentation. Specifically look for requirements related to data migration or database performance. The database design documents are another good source for eliciting database requirements.
3. You should plan for testing both the schema and the data.
4. Limit the scope of your database test. Your obvious focus should be on the test items that are important from a business point of view. For example, if your application is of a financial nature, data accuracy may be critical. If your application is a heavily used web application, the speed and concurrency of database transactions may be very important.
5. Your test environment should include a copy of the database. You may want to design your tests with a test database of small size. However, you should execute your tests on a test database of realistic size and complexity. Further, changes to the test database should be controlled.
6. The team members designing the database tests should be familiar with SQL and database tools specific to your database technology.
7. I find it productive to jot down the main points to cover in the test plan first. Then, I write the test plan. While writing it, if I remember any point that I would like to cover in the test plan, I just add it to my list. Once I cover all the points in the list, I review the test plan section by section. Then, I review the test plan as a whole and submit it for review to others. Others may come back with comments that I then address in the test plan.
8. It is useful to begin with the common sections of the test plan. However, the test plan should be totally customized for its readers and users. Include and exclude information as appropriate. For example, if your defect management process never changes from project to project, you may want to leave it out of the test plan. If you think that query coding standards are applicable to your project, you may want to include them in the test plan (either in the main plan or as an annexure).
Now, let us create a sample database test plan. Realize that it is only a sample. Do not use it as it is. Add or remove sections as appropriate to your project, company or client. Enter as much detail as you think valuable but no more.
For the purpose of our sample, we will choose a database supporting a POS (point of sale) application. We will call our database MyItemsPriceDatabase.
Introduction
This is the test plan for testing MyItemsPriceDatabase. MyItemsPriceDatabase is used in our POS application to provide the current prices of the items. Our application uses other databases as well (e.g. the inventory database), but those databases are out of scope for this test.
The purpose of this test plan is to:
1. Outline the overall test approach
2. Identify the activities required in our database test
3. Define deliverables
Scope
We have identified that the following items are critical to the success of the MyItemsPriceDatabase:
1. The accuracy of uploaded price information (for accuracy of financial calculations)
2. Its speed (in order to provide quick checkouts)
3. Small size (given the restricted local hard disk space on the POS workstation)
Due to time limitations, we will not test the pricing reports run on the database. Further, since it is a single-user database, we will not test database security.
Test Approach
1. Price upload test
Price upload tests will focus on the accuracy with which the new prices are updated in the database. Tests will be designed to compare all prices in the incoming XML with the final prices stored in the database. Only the new prices should change in the database after the upload process. The tests will also measure the time per single price update and compare it with the last benchmark.
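To illustrate the kind of check involved, here is a minimal Python sketch of the price comparison. It assumes a hypothetical incoming file (prices.xml with item elements carrying code and price attributes) and a hypothetical items table with item_code and price columns; the real upload format, schema and database driver will differ.

import sqlite3                      # stand-in for the real database driver
import xml.etree.ElementTree as ET
from decimal import Decimal

def expected_prices(xml_path):
    # Read item code -> price pairs from the incoming price file.
    root = ET.parse(xml_path).getroot()
    return {item.get("code"): Decimal(item.get("price"))
            for item in root.iter("item")}

def actual_prices(conn):
    # Read item code -> price pairs from the items table after the upload.
    rows = conn.execute("SELECT item_code, price FROM items")
    return {code: Decimal(str(price)) for code, price in rows}

def price_mismatches(xml_path, conn):
    expected = expected_prices(xml_path)
    actual = actual_prices(conn)
    # Every price in the incoming XML must appear unchanged in the database.
    return {code: (exp, actual.get(code))
            for code, exp in expected.items()
            if actual.get(code) != exp}

conn = sqlite3.connect("MyItemsPriceDatabase.db")   # hypothetical file name
print(price_mismatches("prices.xml", conn) or "All uploaded prices match.")

A similar comparison of prices that were not in the upload, taken before and after the run, would confirm that only the new prices changed.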
2. Speed test
After analyzing the data provided to us from the field, we have identified the following n queries that are used most often. We will run the queries individually (10 times each) and compare their mean execution times with the last benchmark. Further, we will run all the queries concurrently, in sets of 2 and 3 (based on the maximum number of concurrent checkouts), to uncover any locking issues.
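The individual timing runs could be scripted along the following lines. This is only a sketch with a placeholder query and a placeholder benchmark value; the concurrent runs (sets of 2 and 3) would be scripted separately, for example with multiple connections on separate threads.

import sqlite3
import statistics
import time

# Placeholder for the n most frequently used queries identified from field data.
QUERIES = {
    "price_lookup": "SELECT price FROM items WHERE item_code = 'A100'",
}
# Last benchmark mean execution times in seconds (hypothetical values).
BENCHMARKS = {"price_lookup": 0.010}

def mean_runtime(conn, sql, runs=10):
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        conn.execute(sql).fetchall()        # fetch so the query fully executes
        timings.append(time.perf_counter() - start)
    return statistics.mean(timings)

conn = sqlite3.connect("MyItemsPriceDatabase.db")   # hypothetical file name
for name, sql in QUERIES.items():
    mean = mean_runtime(conn, sql)
    status = "within benchmark" if mean <= BENCHMARKS[name] else "slower than benchmark"
    print(f"{name}: mean {mean:.4f}s ({status})")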
3. Size test
Using SQL queries, we will review the database and the application queries to find the following (a sample of such queries appears after this list):
a. Database items that are never used (e.g. tables, views, and queries such as stored procedures, in-line queries and dynamic queries)
b. Duplicate data in any table
c. Excessive field width in any table
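For example, duplicate rows and over-wide columns could be located with queries along these lines; the table and column names below are placeholders for the real schema.

import sqlite3

# Rows that appear more than once in the items table (duplicate data).
DUPLICATE_ROWS = """
    SELECT item_code, price, COUNT(*) AS copies
    FROM items
    GROUP BY item_code, price
    HAVING COUNT(*) > 1
"""
# Widest value actually stored in a text column, to compare against the
# declared field width and spot excessive widths.
WIDEST_DESCRIPTION = "SELECT MAX(LENGTH(item_description)) FROM items"

conn = sqlite3.connect("MyItemsPriceDatabase.db")   # hypothetical file name
print("Duplicate rows:", conn.execute(DUPLICATE_ROWS).fetchall())
print("Widest description stored:", conn.execute(WIDEST_DESCRIPTION).fetchone()[0])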
Test Environment
The xyz tool will be used to design and execute all database tests. The tests will be executed on the local tester workstations (p in all).
Test Activities and Schedule
1. Review requirements xx/xx/xxxx (start) and xx/xx/xxxx (end)
2. Develop test queries
3. Review test queries
4. Execute size test
5. Execute price upload test
6. Execute speed test
7. Report test results (daily)
8. Submit bug reports and re-test (as required)
9. Submit final test report
Responsibilities
1. Test lead: Responsible for creating this test plan, assigning and reviewing work, reviewing test queries, reviewing and compiling test results, and reviewing bug reports
2. Tester: Responsible for reviewing requirements, developing and testing test queries, executing tests, preparing individual test results, submitting bug reports and re-testing
Deliverables
The testers will produce the following deliverables:
1. Test queries
2. Test results (describing the tests run, run time and pass/fail for each test)
3. Bug reports
Risks
The risks to the successful implementation of this test plan, and their mitigations, are as under:
1.
2.
3.
Approval
Name Role Signature Date
1. ____________________________________________________________
2. ____________________________________________________________
3. ____________________________________________________________