Once testing begins, the developers must create the test data, enter it into the system, and write the test plan.
- The VENDOR must ensure that the test plan, including test cases, is shared with the CLIENT at regular intervals for review.
- The VENDOR team (developers) and the CLIENT must mutually agree on and define a scripting standard for creating test cases.
- The scripting standard used by the VENDOR must define the degree of detail required in test steps (e.g. whether click-by-click instructions are required or just the basic action items).
- The VENDOR must break larger tests into smaller ones.
- The VENDOR must ensure that the test data is of good quality and well structured.
- The VENDOR must ensure that the test cases are in a written format, are consistent in language, and that each has a name, summary, description, steps, expected results, and set-up/clean-up information.
- The VENDOR must create a checklist of items to test functionality, and each functionality must have a test case.
- The checklist used by the VENDOR must be drawn up to test new categories of features (in the case of system enhancement/modification), or to test features requiring in-depth validation of large quantities of data.
- The VENDOR must analyze quality by comparing the estimated and actual numbers of test cases (if actual > estimate, the estimate was configured incorrectly, and vice versa), thereby assessing quality and productivity together.
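The required written test-case structure (name, summary, description, steps, expected results, and set-up/clean-up information) can be sketched as a simple data structure. This is an illustrative sketch only; the class, field names, and completeness check below are assumptions, not part of this agreement.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TestCase:
    # Fields mirror the contents required of each written test case.
    name: str
    summary: str
    description: str
    steps: List[str]                  # detail level follows the agreed scripting standard
    expected_results: List[str]
    setup: str = ""                   # set-up information
    cleanup: str = ""                 # clean-up information

    def is_complete(self) -> bool:
        """Return True only when all mandatory fields are populated."""
        return all([self.name, self.summary, self.description,
                    self.steps, self.expected_results])

tc = TestCase(
    name="TC-001",
    summary="Login with valid credentials",
    description="Verify that a registered user can log in.",
    steps=["Open the login page", "Enter valid credentials", "Click 'Log in'"],
    expected_results=["User is redirected to the dashboard"],
    setup="Create a registered test user",
    cleanup="Delete the test user",
)
print(tc.is_complete())  # → True
```

A template like this also makes it easy to verify consistency across test cases before they are shared with the CLIENT for review.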
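The estimate-versus-actual comparison in the last bullet can be sketched as a small check; the function name and the verdict strings are illustrative assumptions, not terms defined by this document.

```python
def assess_estimation(estimated: int, actual: int) -> str:
    """Compare estimated vs. actual test-case counts.

    actual > estimate suggests the estimate was configured too low
    (underestimated); actual < estimate suggests it was configured
    too high (overestimated).
    """
    if actual > estimated:
        return "underestimated"
    if actual < estimated:
        return "overestimated"
    return "on target"

print(assess_estimation(100, 120))  # → underestimated
print(assess_estimation(100, 80))   # → overestimated
```

Tracking this verdict per release gives a simple joint signal on estimation quality and team productivity.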