6. Understand the Task at Hand and the Related Testing Goal.
7. Consider the Risks.
8. Base Testing Efforts on a Prioritized Feature Schedule.
9. Keep Software Issues in Mind.
10. Acquire Effective Test Data.
11. Plan for the Test Environment.
12. Estimate Test Preparation and Execution.
III. The Testing Team
13. Define the Roles and Responsibilities.
14. Require a Mixture of Testing Skills, Subject Matter Expertise, and Experience.
15. Evaluate the Testers' Effectiveness.
IV. The System Architecture
16. Understand the Architecture and Underlying Components.
17. Verify That the System Supports Testability.
18. Use Logging to Increase System Testability.
19. Verify That the System Supports Debug vs. Release Execution Modes.
V. Test Design and Documentation
20. Divide and Conquer.
21. Mandate the Use of a Test Procedure Template and Other Test Design Standards.
22. Derive Effective Test Cases from Requirements.
23. Treat Test Procedures as "Living" Documents.
24. Use System Design and Prototypes.
25. Use Proven Testing Techniques When Designing Test Case Scenarios.
26. Avoid Constraints and Detailed Data Elements in Test Procedures.
27. Apply Exploratory Testing.
VI. Unit Testing
28. Structure the Development Approach to Support Effective Unit Testing.
29. Develop Unit Tests in Parallel with or before the Implementation.
30. Make Unit Test Execution Part of the Build Process.
VII. Automated Testing Tools
31. Be Aware of the Different Types of Testing Support Tools.
32. Consider Building a Tool Instead of Buying One.
33. Be Aware of the Impact of Automated Tools on the Testing Effort.
34. Focus on the Needs of Your Organization.
35. Test the Tools on an Application Prototype.
VIII. Automated Testing—Selected Best Practices
36. Do Not Rely Solely on Capture/Playback.
37. Develop a Test Harness When Necessary.
38. Use Proven Test Script Development Techniques.
39. Automate Regression Tests Whenever Possible.
40. Implement Automated Builds and Smoke Tests.
IX. Nonfunctional Testing
41. Do Not Make Nonfunctional Testing an Afterthought.
42. Conduct Performance Testing with Production-Sized Databases.
43. Tailor Usability Tests to the Intended Audience.
44. Consider All Aspects of Security, for Specific Requirements and System-Wide.
45. Investigate the System's Implementation to Plan for Concurrency Tests.
46. Set Up an Efficient Environment for Compatibility Testing.
X. Managing the Test Execution
47. Clearly Define the Beginning and the End of the Test Execution Cycle.
48. Isolate the Test Environment from the Development Environment.
49. Implement a Defect-Tracking Life Cycle.
50. Track the Execution of the Test Program.