This repository was archived by the owner on Jun 3, 2020. It is now read-only.
PostgreSQL_Test_Plan
Grant Gainey edited this page Dec 12, 2016 · 2 revisions
The objective is to describe the test plan for Spacewalk during the migration of the database from Oracle to Postgres, as well as testing that the application meets predefined performance benchmarks and all functionality against both databases.
- Functional Testing
- Query/Stored Procedure Unit Tests
- System Testing
- Performance Test - queries
- Data Migration - Oracle -> Postgres, Postgres -> Oracle
- API Testing.
- Upgrade Testing
- Scale/Concurrency Testing (eg: large numbers of web requests at once)
- Application Knowledge Transfer.
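As a rough illustration of the scale/concurrency item above, a load test fires many requests at once and tallies failures. A minimal sketch, assuming a hypothetical `check_in` function stands in for a real Spacewalk web request:

```python
from concurrent.futures import ThreadPoolExecutor

def check_in(system_id):
    # Stand-in for a real web request (e.g. a client check-in);
    # a real test would hit the Spacewalk HTTP interface instead.
    return {"system": system_id, "status": 200}

def run_load(n_requests, workers=50):
    # Issue n_requests concurrently and tally non-200 responses.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(check_in, range(n_requests)))
    failures = [r for r in results if r["status"] != 200]
    return len(results), len(failures)

total, failed = run_load(5000)
print(total, failed)  # 5000 0
```

A real run would replace `check_in` with an HTTP call and record latencies alongside the failure count.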
- Environment Identification and Setup
- Access to Scripts and Knowledge Transfer - the scripts referred to here are the available Selenium scripts.
- Identify all the gaps in the existing scripts
- Prioritize the order of the modules.
- Establish a set of benchmarks for performance on Oracle.
- Communication & Reporting plan
- Plan to execute and Test modules - along with sign-offs
- UAT.
- Establish a set of scale benchmarks for performance on Oracle.
- The project wiki - will be the primary communication area regarding tasks, dates, and tracking progress.
- Conference calls - twice a week - to measure progress.
- Daily coverage during the Team overlap hours - available on the IRC channel as well as other identified modes.
- Identify the tool to report and Track issues to closure.
- Identify the priority and severity.
- The process for the life cycle of an issue - report - plan - resolve - test - close
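The issue life cycle above can be enforced mechanically once a tracking tool is chosen; a minimal sketch (names are illustrative, not any actual tracker's API):

```python
# Allowed transitions in the issue life cycle:
# report -> plan -> resolve -> test -> close
TRANSITIONS = {
    "report": "plan",
    "plan": "resolve",
    "resolve": "test",
    "test": "close",
}

class Issue:
    def __init__(self, title):
        self.title = title
        self.state = "report"  # every issue starts as reported

    def advance(self):
        # Move to the next stage; refuse to skip stages or reopen.
        if self.state == "close":
            raise ValueError("issue already closed")
        self.state = TRANSITIONS[self.state]
        return self.state

issue = Issue("slow errata query on Postgres")
while issue.state != "close":
    issue.advance()
print(issue.state)  # close
```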
- Query/Stored Procedures
- Identify and document the user acceptance criteria for the application.
- Includes the sign-off criteria.
- Spend 1 day teaching the test resources the Spacewalk application functionality.
Initial estimate is one day for a walk-through; then, as we identify the priority of the modules, we will go deeper into the functionality and evolve the knowledge as we build the test cases and core team members review them.
- Identify Environment Setup:
  1. Identified as a 4-machine setup.
  2. Get further details on hardware and software requirements.
- Acquisition of the Environment.
- Set up the environment - a 2-day task.
- Set up access for Red Hat to the Environment.
- Identify all the different modules to Test.
- Create the priority Order
- Create functional Use Cases and Test Cases around the modules.
  1. Who will do this? - Ideally the Testing Team will develop these.
  2. Without a complete understanding of the application, how will EDB do this? - This is a way to force knowledge of the application; we will track the quality of the cases by constant review from the core Team.
  3. Will these be documented using RH test case templates? - Definitely; we can use the existing Test Template.
  4. Are the test cases to be created for ALL functionality or just those functions not already documented by RH test cases? - We will build upon the existing base.
- Review -> update -> review -> sign off cycle for each module.
- The Modules identified to Test are:
  1. Web-UI
  2. API
  3. Release Engineering
  4. E-mail regression
  5. Sanity
  6. Proxy
  7. Monitoring
  8. Quick Search
  9. Advanced Search
  10. Channels
  11. Errata
  12. Auto Errata Updates
  13. Errata Search
  14. Configuration Management
  15. Pagination
  16. RHN Registration
  17. SDC Software
  18. Activation Key
  19. Reactivation Key
  20. Multi Org - RHN
  21. Multi Org - II
  22. Authentication
  23. Virtualization
  24. Kickstart
  25. Solaris
  26. Users
  27. SSM
  28. Satellite Sync & Export
- Get access to existing automated test scripts.
- Set up and run the automated test scripts.
- Knowledge Transfer on the existing Test scripts.
- Start building on the existing Test Scripts.
We need to account for manual testing using client-side parts of the application. For example: have 5000+ systems register or check in and receive some number of updates (rpms).
- Get a sample data set.
- Identify use cases for performance testing.
- Identify the parameters of performance testing.
- Create a baseline benchmark with the existing application.
- Run the same on the migrated environment, then identify and fix any regressions.
- Identify test cases for scale and DB concurrency testing.
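One way to capture the baseline benchmark described above is to time a representative query repeatedly and record latency percentiles. A sketch, using sqlite3 purely as a stand-in for the real Oracle/Postgres connections and an illustrative table name:

```python
import sqlite3
import statistics
import time

def benchmark(conn, sql, runs=100):
    # Time `runs` executions of the query and report latency stats (ms).
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        conn.execute(sql).fetchall()
        timings.append((time.perf_counter() - start) * 1000)
    timings.sort()
    return {
        "median_ms": statistics.median(timings),
        "p95_ms": timings[int(len(timings) * 0.95) - 1],
    }

# Stand-in schema; the real baseline would run against Oracle first,
# then the same queries against the migrated Postgres schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE servers (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO servers (name) VALUES (?)",
                 [("host%d" % i,) for i in range(1000)])
print(benchmark(conn, "SELECT count(*) FROM servers"))
```

Running the same harness against both databases yields comparable numbers for the score sheets mentioned later in the plan.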
- Assumption - migration only from the latest release version, Oracle to Postgres and Postgres to Oracle.
- Create a data set for migration.
- Migrate to Postgres.
- Run regression of the whole application.
- Migrate to Oracle from Postgres.
- Run regression of the whole application.
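Before running full regression after each migration pass, a quick sanity check is to compare row counts (and ideally checksums) between source and target. A sketch with two sqlite3 databases standing in for the Oracle and Postgres sides, using illustrative table names:

```python
import sqlite3

def row_counts(conn, tables):
    # Count rows per table so source and target can be diffed.
    return {t: conn.execute("SELECT count(*) FROM %s" % t).fetchone()[0]
            for t in tables}

def verify_migration(source, target, tables):
    src, dst = row_counts(source, tables), row_counts(target, tables)
    # Report any table whose row count changed during migration.
    return {t: (src[t], dst[t]) for t in tables if src[t] != dst[t]}

tables = ["servers", "channels"]
oracle_like = sqlite3.connect(":memory:")    # stand-in for the Oracle source
postgres_like = sqlite3.connect(":memory:")  # stand-in for the Postgres target
for conn in (oracle_like, postgres_like):
    for t in tables:
        conn.execute("CREATE TABLE %s (id INTEGER PRIMARY KEY)" % t)
        conn.executemany("INSERT INTO %s (id) VALUES (?)" % t,
                         [(i,) for i in range(1, 51)])
print(verify_migration(oracle_like, postgres_like, tables))  # {} (counts match)
```

An empty result means counts match in both directions; any mismatch is a candidate for the issue life cycle described earlier.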
- Schema upgrade testing.
- Run a complete regression of the application.
- Perform performance testing and generate a score sheet.
- Performance scale/concurrency testing and generate a score sheet.
- Review and get sign-offs. This is 2-phase:
  1. Module-by-module sign-offs
  2. Entire application sign-off.
Do you want to contribute to this wiki? See page WikiContribute for more info.