1. Database testing can get complex. It may be worth your while to create a separate test plan dedicated to database testing.
2. Look for database-related requirements in your requirements documentation, especially requirements related to data migration or database performance. The database design documents are another good source for eliciting database requirements.
3. You should plan for testing both the schema and the data.
4. Limit the scope of your database test. Focus on the test items that are important from a business point of view. For example, if your application is of a financial nature, data accuracy may be critical. If your application is a heavily used web application, the speed and concurrency of database transactions may be more important.
5. Your test environment should include a copy of the database. You may want to design your tests with a test database of small size. However, you should execute your tests on a test database of realistic size and complexity. Further, changes to the test database should be controlled.
6. The team members designing the database tests should be familiar with SQL and database tools specific to your database technology.
7. I find it productive to jot down the main points to cover in the test plan first, and then write the test plan. While writing, if I remember another point I would like to cover, I simply add it to my list. Once I have covered all the points on the list, I review the test plan section by section. Then, I review the test plan as a whole and submit it to others for review. They may come back with comments, which I then address in the test plan.
8. It is useful to begin with the common sections of the test plan. However, the test plan should be fully customized for its readers and users. Include and exclude information as appropriate. For example, if your defect management process never changes from project to project, you may want to leave it out of the test plan. If you think that query coding standards apply to your project, you may want to include them in the test plan (either in the main plan or as an annexure).
Now, let us create a sample database test plan. Realize that it is only a sample; do not use it as-is. Add or remove sections as appropriate to your project, company, or client. Enter as much detail as you think valuable, but no more.
For the purpose of our sample, we will choose a database supporting a POS (point of sale) application. We will call our database MyItemsPriceDatabase.
This is the test plan for testing MyItemsPriceDatabase. MyItemsPriceDatabase is used in our POS application to provide the current prices of the items. Our application uses other databases as well (e.g. the inventory database), but those databases are out of scope for this test.
The purpose of this test plan is to:
1. Outline the overall test approach
2. Identify the activities required in our database test
3. Define deliverables
We have identified that the following items are critical to the success of the MyItemsPriceDatabase:
1. The accuracy of uploaded price information (for accuracy of financial calculations)
2. Its speed (in order to provide quick checkouts)
3. Small size (given the restricted local hard disk space on the POS workstation)
Due to time limitations, we will not test the pricing reports run on the database. Further, since it is a single-user database, we will not test database security.
1. Price upload test
Price upload tests will focus on the accuracy with which the new prices are updated in the database. Tests will be designed to compare all prices in the incoming XML with the final prices stored in the database. Only the new prices should change in the database after the upload process. The tests will also measure the time per single price update and compare it with the last benchmark.
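The accuracy comparison above can be sketched in code. This is a minimal, hypothetical example: it assumes an `items` table with `sku` and `price` columns and an XML feed of `<item sku="..." price="...">` elements, neither of which is the real MyItemsPriceDatabase design.

```python
# Hypothetical sketch of the price upload accuracy check: compare every
# price in the incoming XML feed with the price stored in the database
# after the upload. The schema (items table with sku/price columns) and
# the XML layout are assumptions for illustration only.
import sqlite3
import xml.etree.ElementTree as ET


def check_price_upload(db_path, xml_text):
    """Return a list of (sku, expected_price, stored_price) mismatches."""
    # Parse the expected prices out of the incoming XML feed.
    expected = {
        item.get("sku"): float(item.get("price"))
        for item in ET.fromstring(xml_text).iter("item")
    }
    conn = sqlite3.connect(db_path)
    try:
        # Read back what the upload actually stored.
        stored = dict(conn.execute("SELECT sku, price FROM items"))
    finally:
        conn.close()
    mismatches = []
    for sku, price in expected.items():
        if stored.get(sku) != price:
            mismatches.append((sku, price, stored.get(sku)))
    return mismatches
```

A test would run the upload process, call `check_price_upload`, and fail if the returned list is non-empty.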
2. Speed test
After analyzing the data provided to us from the field, we have identified the following n queries as the most frequently used. We will run each query individually (10 times each) and compare its mean execution time with the last benchmark. Further, we will run the queries concurrently, in sets of 2 and 3 (based on the maximum number of concurrent checkouts), to uncover any locking issues.
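The individual timing runs described above could be harnessed along these lines. The query texts and benchmark values are placeholders; the real test would use the n queries identified from the field data.

```python
# Illustrative sketch of the speed test: run each candidate query a fixed
# number of times, compute its mean execution time, and flag any query
# whose mean exceeds its last benchmark. Queries and benchmarks here are
# hypothetical placeholders, not the real POS queries.
import sqlite3
import time

RUNS = 10  # executions per query, as stated in the plan


def time_queries(conn, queries, benchmarks):
    """queries: {name: sql}; benchmarks: {name: seconds}.

    Returns {name: (mean_seconds, regressed)} where regressed is True
    when the mean execution time exceeds the benchmark.
    """
    results = {}
    for name, sql in queries.items():
        elapsed = []
        for _ in range(RUNS):
            start = time.perf_counter()
            conn.execute(sql).fetchall()  # fetch fully so timing is realistic
            elapsed.append(time.perf_counter() - start)
        mean = sum(elapsed) / RUNS
        results[name] = (mean, mean > benchmarks[name])
    return results
```

The concurrent runs (sets of 2 and 3 queries) would need separate connections on separate threads or processes, since the point there is to surface locking behavior rather than raw speed.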
3. Size test
Using SQL queries, we will review the database and the application queries to find the following:
a. Items which are never used (e.g. tables, views, queries (stored procedures, in-line queries and dynamic queries))
b. Duplicate data in any table
c. Excessive field width in any table
The xyz tool will be used to design and execute all database tests. The tests will be executed on the local tester workstations (p in all).
Test Activities and Schedule
1. Review requirements xx/xx/xxxx (start) and xx/xx/xxxx (end)
2. Develop test queries
3. Review test queries
4. Execute size test
5. Execute price upload test
6. Execute speed test
7. Report test results (daily)
8. Submit bug reports and re-test (as required)
9. Submit final test report
1. Test lead: Responsible for creating this test plan, assigning and reviewing work, reviewing the test queries, reviewing and compiling the test results, and reviewing the bug reports
2. Tester: Responsible for reviewing the requirements, developing and testing the test queries, executing the tests, preparing individual test results, submitting bug reports, and re-testing
The testers will produce the following deliverables:
1. Test queries
2. Test results (describing the tests run, run time, and pass/fail for each test)
3. Bug reports
The risks to the successful implementation of this test plan, and their mitigations, are as follows:
Approvals:

Name          Role          Signature          Date