# PyTorch-Lightning Tests
Most PL tests train a full MNIST model under various trainer conditions (ddp, ddp2+amp, etc.). This exercises most combinations of the important settings. To pass, a test expects the model to reach a reasonable test accuracy.
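For orientation, the snippet below is a rough sketch of that pattern rather than one of the actual tests: the model, the data, and the exact `Trainer` arguments are illustrative placeholders, and hook conventions vary between Lightning versions.

```python
# Minimal sketch of the test pattern (hypothetical; the real tests live in
# tests/test_*.py, train on MNIST, and use stricter accuracy assertions).
import torch
from torch.utils.data import DataLoader, TensorDataset
from pytorch_lightning import LightningModule, Trainer


class SketchModel(LightningModule):
    """Tiny stand-in for the MNIST models used by the real tests."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.02)

    def train_dataloader(self):
        x = torch.randn(64, 32)
        y = torch.randint(0, 2, (64,))
        return DataLoader(TensorDataset(x, y), batch_size=16)


def test_cpu_model_sketch():
    model = SketchModel()
    # fast_dev_run only runs a single batch; the real tests train longer and
    # then assert that the validation/test accuracy clears a threshold.
    trainer = Trainer(fast_dev_run=True)
    trainer.fit(model)
```

The real tests in `test_cpu_models.py`, `test_gpu_models.py`, etc. follow this build-model / build-trainer / fit / assert structure with different trainer conditions.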
## Running tests
The automatic Travis CI tests run ONLY CPU-based tests. Although these cover most of the use cases, run on a 2-GPU machine to validate the full test suite.
To run all tests, do the following:

```bash
git clone https://github.com/PyTorchLightning/pytorch-lightning
cd pytorch-lightning

# install module locally
pip install -e .

# install dev deps
pip install -r requirements.txt

# run tests
py.test -v
```
To test models that require a GPU, make sure to run the above command on a GPU machine. The GPU machine must have the following (a quick prerequisite check is sketched after this list):
- At least 2 GPUs.
- NVIDIA-apex installed.
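If you are not sure whether a machine meets these requirements, a quick check along the following lines can help (an illustrative sketch, not part of the test suite):

```python
# Quick sanity check for the GPU test prerequisites listed above
# (illustrative only; not part of the test suite).
import importlib.util

import torch

n_gpus = torch.cuda.device_count()                        # need at least 2
has_apex = importlib.util.find_spec("apex") is not None   # "apex" is the NVIDIA-apex import name

print(f"visible GPUs: {n_gpus} (need >= 2)")
print(f"NVIDIA apex installed: {has_apex}")

if n_gpus < 2 or not has_apex:
    raise SystemExit("this machine does not meet the GPU test requirements")
```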
## Running Coverage
Make sure to run coverage on a GPU machine with at least 2 GPUs and NVIDIA apex installed.
```bash
cd pytorch-lightning

# generate coverage
pip install coverage
coverage run --source pytorch_lightning -m py.test pytorch_lightning tests examples -v --doctest-modules

# print coverage stats
coverage report -m

# exporting results
coverage xml
codecov -t 17327163-8cca-4a5d-86c8-ca5f2ef700bc -v
```