Smoke testing
This covers all the areas that need to be thoroughly checked during the code freeze before a release, as well as how to test them. Each of the checks below should be done first in the valid, expected way and then in a deliberately malicious way, with the intent of trying to break the program. All problems discovered should be raised as issues on the repository and, based on severity, addressed either before or after the release.
Before doing any of these, ensure you are using an up-to-date version of the code on master and all the settings files are using production settings (assuming you are on the production node).
- Ensure that all the resources are pulled down correctly from git. The following should be present on each node:
  - Linux: QueueProcessors/, utils/, setup.py, requirements.txt
  - Windows (Webapp): Webapp/, setup.py, requirements.txt
  - Windows (Utility machine): EndOfRunMonitor/, utils/, setup.py, requirements.txt
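As a quick sanity check that the expected resources are present, something like the following can be run from the repository root; this assumes the Linux list above refers to the top-level QueueProcessors/ and utils/ directories plus the root setup.py and requirements.txt, so adjust the file list for the Windows nodes.

```
# check the Linux node's expected resources are present (file list assumed from the list above)
for resource in QueueProcessors utils setup.py requirements.txt; do
    [ -e "$resource" ] && echo "OK: $resource" || echo "MISSING: $resource"
done
```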
Run all valid unit tests on the node. These will vary depending on which node you are testing, but can be run with:
pytest <name_of_directory>
This is an easy way to catch obvious errors in the setup of the project.
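For instance, on the Linux node this might look like the following (the directory name is purely illustrative):

```
# illustrative example - run the tests for one area of the project on the Linux node
pytest QueueProcessors
```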
- Point the end of run monitor at a fake data archive (via the settings files)
  - You can create a fake archive using the DataArchiveCreator
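If the DataArchiveCreator is not to hand, a minimal fake archive can also be laid out by hand. The sketch below is only illustrative: the directory structure, instrument name, cycle and lastrun.txt format are all assumptions and should be matched to whatever the settings files actually expect (paths are shown Unix-style for brevity; use an equivalent local path on the Windows utility machine).

```
# hypothetical fake-archive layout; instrument name, cycle and run number are illustrative
mkdir -p /tmp/fake-archive/NDXWISH/Instrument/logs
mkdir -p /tmp/fake-archive/NDXWISH/Instrument/data/cycle_18_1
# lastrun.txt format assumed to be "<instrument> <run number> <flag>"
echo "WISH 00012345 0" > /tmp/fake-archive/NDXWISH/Instrument/logs/lastrun.txt
```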
- Re-install the service using python isis_monitor_win_service.py install
- Start the service
- Update the lastrun.txt file for a given instrument and ensure that the data message is sent (this can be validated in hawtio)
- Repeat this for all valid autoreduction instruments
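A minimal sketch of the service commands on the Windows utility machine; only the install command is stated above, and the start verb is an assumption based on the standard pywin32 service command-line handling.

```
# re-install the end of run monitor service
python isis_monitor_win_service.py install
# start it ("start" assumes the standard pywin32 service verbs)
python isis_monitor_win_service.py start
```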
- Start both of the QueueProcessors on the Linux node using QueueProcessor/restart.sh
- Use ps aux | grep python to validate that both of these services have started
- Use the manual submission script scripts/manual_submission_script/manual_submission.py to check that data can be sent through from every instrument
- Check the database to validate that all the runs made it to the database
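Taken together, the Linux-node checks in this step look roughly like the following; the instrument name and run number passed to the submission script are placeholders only.

```
# restart both QueueProcessors
./QueueProcessor/restart.sh

# confirm both python processes came up
ps aux | grep python

# placeholder submission - substitute a real instrument and run number
python scripts/manual_submission_script/manual_submission.py WISH 12345
```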
- Boot up the webapp using Apache (see the production installation instructions for how to do this).
- Ensure that the webapp is visible from outside of the local environment - e.g. by going to the URL from another machine
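A simple way to check this from another machine, assuming the webapp is reachable over HTTP; the hostname below is a placeholder and should be replaced with the production URL.

```
# request just the headers from outside the production node; hostname is a placeholder
curl -I http://autoreduction.example.ac.uk/
```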
- Test the basic functionality of the webapp:
- Run inspection
- Run resubmission
- All the navigation works
- New runs appear when the database is changed
- Test that the webapp admin content is working:
- Back up (dump) the data first - in case a revert is required due to an improper flush
python manage.py dumpdata > db.json
- Check that db.json contains something that looks vaguely sensible
- Some data should be preserved (such as user details and static data - instruments/status flags)
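A quick, tool-agnostic way to confirm the dump is at least well-formed JSON before deleting anything; this uses only the Python standard library and is not part of the project.

```
# confirm db.json parses as valid JSON before flushing any tables
python -c "import json; json.load(open('db.json')); print('db.json parses as valid JSON')"
```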
- Selectively delete data from the following tables, in the order listed below (a command sketch follows the table list):
reduction_variables_runvariable
reduction_variables_instrumentvariable
reduction_variables_variable
reduction_viewer_datalocation
reduction_viewer_reductionlocation
reduction_viewer_reductionrun
reduction_viewer_experiment
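A hedged sketch of the deletions, assuming a MySQL backend, a database named autoreduction and placeholder credentials - substitute whatever client, schema name and account the production settings actually use. The order above clears child tables before their parents so foreign-key constraints are respected.

```
# assumptions: MySQL backend, database named "autoreduction", placeholder credentials
mysql -u <user> -p autoreduction -e "
  DELETE FROM reduction_variables_runvariable;
  DELETE FROM reduction_variables_instrumentvariable;
  DELETE FROM reduction_variables_variable;
  DELETE FROM reduction_viewer_datalocation;
  DELETE FROM reduction_viewer_reductionlocation;
  DELETE FROM reduction_viewer_reductionrun;
  DELETE FROM reduction_viewer_experiment;
"
```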
- Submit one run per instrument - This is to ensure all the options are valid on the webapp
python scripts/manual_submission/manual_submission <INST> <run_number>
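If many instruments need covering, the same command can be looped; the instrument names and run number here are placeholders only and each instrument will need a valid run number of its own.

```
run_number=12345                    # placeholder - substitute a valid run number per instrument
for inst in WISH GEM POLARIS; do    # placeholder instrument names
    python scripts/manual_submission/manual_submission "$inst" "$run_number"
done
```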
- Once the database changes are validated, db.json can be removed