# lightning/tests/plugins/test_ddp_plugin.py

import os
import platform
from unittest import mock

import pytest

from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import Callback
from pytorch_lightning.plugins.ddp_plugin import DDPPlugin
from pytorch_lightning.plugins.sharded_plugin import DDPShardedPlugin
from pytorch_lightning.utilities import FAIRSCALE_AVAILABLE
from pytorch_lightning.utilities.exceptions import MisconfigurationException
from tests.base.boring_model import BoringModel


@mock.patch.dict(
os.environ,
{
"CUDA_VISIBLE_DEVICES": "0,1",
"SLURM_NTASKS": "2",
"SLURM_JOB_NAME": "SOME_NAME",
"SLURM_NODEID": "0",
"LOCAL_RANK": "0",
"SLURM_LOCALID": "0",
},
)
@mock.patch("torch.cuda.device_count", return_value=2)
@pytest.mark.parametrize(
["ddp_backend", "gpus", "num_processes"],
[("ddp_cpu", None, None), ("ddp", 2, 0), ("ddp2", 2, 0), ("ddp_spawn", 2, 0)],
)
def test_ddp_choice_default_ddp_cpu(tmpdir, ddp_backend, gpus, num_processes):
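    """Test that the default ``DDPPlugin`` is used when no plugin is passed."""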
class CB(Callback):
def on_fit_start(self, trainer, pl_module):
assert isinstance(trainer.accelerator_backend.ddp_plugin, DDPPlugin)
            raise RuntimeError('finished plugin check')

model = BoringModel()
trainer = Trainer(
fast_dev_run=True,
gpus=gpus,
num_processes=num_processes,
distributed_backend=ddp_backend,
callbacks=[CB()],
)

    with pytest.raises(RuntimeError, match='finished plugin check'):
        trainer.fit(model)


@mock.patch.dict(
os.environ,
{
"CUDA_VISIBLE_DEVICES": "0,1",
"SLURM_NTASKS": "2",
"SLURM_JOB_NAME": "SOME_NAME",
"SLURM_NODEID": "0",
"LOCAL_RANK": "0",
"SLURM_LOCALID": "0",
},
)
@mock.patch("torch.cuda.device_count", return_value=2)
@pytest.mark.parametrize(
["ddp_backend", "gpus", "num_processes"],
[("ddp_cpu", None, None), ("ddp", 2, 0), ("ddp2", 2, 0), ("ddp_spawn", 2, 0)],
)
def test_ddp_choice_custom_ddp_cpu(tmpdir, ddp_backend, gpus, num_processes):
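    """Test that a ``DDPPlugin`` subclass passed via ``plugins`` replaces the default."""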
class MyDDP(DDPPlugin):
        pass

class CB(Callback):
def on_fit_start(self, trainer, pl_module):
assert isinstance(trainer.accelerator_backend.ddp_plugin, MyDDP)
            raise RuntimeError('finished plugin check')

model = BoringModel()
trainer = Trainer(
fast_dev_run=True,
gpus=gpus,
num_processes=num_processes,
distributed_backend=ddp_backend,
plugins=[MyDDP()],
callbacks=[CB()],
)

    with pytest.raises(RuntimeError, match='finished plugin check'):
        trainer.fit(model)


@mock.patch.dict(
os.environ,
{
"CUDA_VISIBLE_DEVICES": "0,1",
"SLURM_NTASKS": "2",
"SLURM_JOB_NAME": "SOME_NAME",
"SLURM_NODEID": "0",
"LOCAL_RANK": "0",
"SLURM_LOCALID": "0",
},
)
@mock.patch("torch.cuda.device_count", return_value=2)
@pytest.mark.parametrize(
["ddp_backend", "gpus", "num_processes"],
[("ddp_cpu", None, None), ("ddp", 2, 0), ("ddp2", 2, 0), ("ddp_spawn", 2, 0)],
)
@pytest.mark.skipif(platform.system() == "Windows", reason="Distributed sharded plugin is not supported on Windows")
@pytest.mark.skipif(not FAIRSCALE_AVAILABLE, reason="Fairscale is not available")
def test_ddp_choice_string_ddp_cpu(tmpdir, ddp_backend, gpus, num_processes):
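    """Test that ``plugins='ddp_sharded'`` selects the ``DDPShardedPlugin``."""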
class CB(Callback):
def on_fit_start(self, trainer, pl_module):
assert isinstance(trainer.accelerator_backend.ddp_plugin, DDPShardedPlugin)
            raise RuntimeError('finished plugin check')

model = BoringModel()
trainer = Trainer(
fast_dev_run=True,
gpus=gpus,
num_processes=num_processes,
distributed_backend=ddp_backend,
plugins='ddp_sharded',
callbacks=[CB()],
    )

    with pytest.raises(RuntimeError, match='finished plugin check'):
        trainer.fit(model)


@mock.patch.dict(
os.environ,
{
"CUDA_VISIBLE_DEVICES": "0,1",
"SLURM_NTASKS": "2",
"SLURM_JOB_NAME": "SOME_NAME",
"SLURM_NODEID": "0",
"LOCAL_RANK": "0",
"SLURM_LOCALID": "0",
},
)
@mock.patch("torch.cuda.device_count", return_value=2)
@pytest.mark.parametrize(
["ddp_backend", "gpus", "num_processes"],
[("ddp_cpu", None, None), ("ddp", 2, 0), ("ddp2", 2, 0), ("ddp_spawn", 2, 0)],
)
def test_ddp_invalid_choice_string_ddp_cpu(tmpdir, ddp_backend, gpus, num_processes):
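    """Test that an invalid plugin string raises a ``MisconfigurationException``."""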
with pytest.raises(MisconfigurationException, match='not a supported lightning custom plugin'):
Trainer(
fast_dev_run=True,
gpus=gpus,
num_processes=num_processes,
distributed_backend=ddp_backend,
plugins='invalid',
        )


@mock.patch.dict(
os.environ,
{
"CUDA_VISIBLE_DEVICES": "0,1",
"SLURM_NTASKS": "2",
"SLURM_JOB_NAME": "SOME_NAME",
"SLURM_NODEID": "0",
"LOCAL_RANK": "0",
"SLURM_LOCALID": "0",
},
)
@mock.patch("torch.cuda.device_count", return_value=2)
@pytest.mark.parametrize(
["ddp_backend", "gpus", "num_processes"],
[("ddp_cpu", None, None), ("ddp", 2, 0), ("ddp2", 2, 0), ("ddp_spawn", 2, 0)],
)
@pytest.mark.skipif(platform.system() == "Windows", reason="Distributed sharded plugin is not supported on Windows")
@pytest.mark.skipif(not FAIRSCALE_AVAILABLE, reason="Fairscale is not available")
def test_ddp_invalid_choice_string_and_custom_ddp_cpu(tmpdir, ddp_backend, gpus, num_processes):
"""
Test passing a lightning custom ddp plugin and a default ddp plugin throws an error.
"""
class MyDDP(DDPPlugin):
        pass

with pytest.raises(MisconfigurationException, match='you can only use one DDP plugin in plugins'):
Trainer(
fast_dev_run=True,
gpus=gpus,
num_processes=num_processes,
distributed_backend=ddp_backend,
plugins=['ddp_sharded', MyDDP()],
        )


@mock.patch.dict(
os.environ,
{
"CUDA_VISIBLE_DEVICES": "0,1",
"SLURM_NTASKS": "2",
"SLURM_JOB_NAME": "SOME_NAME",
"SLURM_NODEID": "0",
"LOCAL_RANK": "0",
"SLURM_LOCALID": "0",
},
)
@mock.patch("torch.cuda.device_count", return_value=2)
@pytest.mark.parametrize(
["ddp_backend", "gpus", "num_processes"],
[("ddp_cpu", None, None), ("ddp", 2, 0), ("ddp2", 2, 0), ("ddp_spawn", 2, 0)],
)
def test_ddp_choice_custom_ddp_cpu_custom_args(
tmpdir, ddp_backend, gpus, num_processes
):
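    """Test that a custom ``DDPPlugin`` accepts DDP kwargs like ``broadcast_buffers``."""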
class MyDDP(DDPPlugin):
        pass

class CB(Callback):
def on_fit_start(self, trainer, pl_module):
assert isinstance(trainer.accelerator_backend.ddp_plugin, MyDDP)
            raise RuntimeError('finished plugin check')

model = BoringModel()
trainer = Trainer(
fast_dev_run=True,
gpus=gpus,
num_processes=num_processes,
distributed_backend=ddp_backend,
plugins=[MyDDP(broadcast_buffers=False, find_unused_parameters=True)],
callbacks=[CB()],
)

    with pytest.raises(RuntimeError, match='finished plugin check'):
        trainer.fit(model)