commit e3001a0929
* Add tests for distributed backend config
* Refactor set_distributed_mode
* Use gloo backend on CPU
* Use 127.0.0.1 instead of 127.0.0.2. Not entirely clear why this is necessary, but it seems to work
* Update LightningDDP so that it works with CPU
* Add ddp_cpu backend and num_processes Trainer arg
* PEP8
* Fix test skipping. Inequalities are hard :/
* Skip ddp_cpu test on Windows
* Make a few more cases fall back to ddp_cpu
* New function name
* Flake8
* Don't test distributed on macOS with torch < 1.3. Distributed support on macOS was added in torch 1.3.0
* Add ddp_cpu and num_processes to docs
* Parametrize trainer config tests
* Tweak warning

  Co-Authored-By: Jirka Borovec <Borda@users.noreply.github.com>
* Remove redundant test
* Replace pass branches with comments
* Add missing warnings import
* save_path -> root_dir
* Use new rank_zero_warn
* Whitespace
* Apply suggestions from code review
* formatting

Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: J. Borovec <jirka.borovec@seznam.cz>
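The "fall back to ddp_cpu" items above describe selecting a CPU-only DDP mode when distributed training is requested but no GPUs are available. A minimal sketch of that selection logic, in plain Python — the function and argument names here are illustrative assumptions, not Lightning's actual internal API:

```python
# Hypothetical sketch of the ddp_cpu fallback described in the commit message.
# Names (select_backend, requested, num_gpus, num_processes) are illustrative,
# not the real set_distributed_mode signature.

def select_backend(requested, num_gpus, num_processes):
    """Pick a distributed mode, falling back to ddp_cpu when no GPUs exist."""
    if requested == "ddp" and num_gpus == 0 and num_processes > 1:
        # ddp was requested but there are no GPUs: run CPU processes
        # over the gloo backend instead (nccl requires GPUs).
        return "ddp_cpu"
    if requested is None and num_processes > 1:
        # Asking for multiple processes alone implies CPU-only DDP.
        return "ddp_cpu"
    return requested

print(select_backend("ddp", num_gpus=0, num_processes=2))  # ddp_cpu
```

The key design point the commit reflects: on CPU, torch.distributed must use the gloo backend, since nccl only works with CUDA devices.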
* __init__.py
* data_parallel.py
* override_data_parallel.py
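The "Use new rank_zero_warn" item refers to emitting a warning only from the rank-zero process, so multi-process runs don't print the same message once per worker. A minimal sketch of that pattern, assuming the rank is read from an environment variable (the real Lightning helper lives in the library and may differ):

```python
# Illustrative sketch of a rank-zero-only warning helper; the environment
# variable name (LOCAL_RANK) and decorator name are assumptions.
import os
import warnings
from functools import wraps


def rank_zero_only(fn):
    """Call fn only in the process whose local rank is 0."""
    @wraps(fn)
    def wrapped(*args, **kwargs):
        if int(os.environ.get("LOCAL_RANK", 0)) == 0:
            return fn(*args, **kwargs)
        # Non-zero ranks silently skip the call.
    return wrapped


# A warn() that fires once per job rather than once per process.
rank_zero_warn = rank_zero_only(warnings.warn)
```

Usage: every worker can call `rank_zero_warn("deprecated arg")` unconditionally, and only the rank-0 process actually emits the warning.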