# lightning/pytorch_lightning/plugins/plugin_connector.py
# Copyright The PyTorch Lightning team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from enum import Enum
from typing import List, Optional, Union
from pytorch_lightning.cluster_environments import ClusterEnvironment
from pytorch_lightning.plugins.apex import ApexPlugin
from pytorch_lightning.plugins.ddp_plugin import DDPPlugin
from pytorch_lightning.plugins.native_amp import NativeAMPPlugin
from pytorch_lightning.plugins.plugin import LightningPlugin
from pytorch_lightning.plugins.sharded_plugin import DDPShardedPlugin
from pytorch_lightning.utilities import AMPType, rank_zero_warn
from pytorch_lightning.utilities.exceptions import MisconfigurationException


class PluginConnector:

    def __init__(self, trainer):
        self.trainer = trainer
        self.plugins = []
        self.ddp_plugin = DDPPlugin()
        self.cloud_environment = None
        self.amp_plugin = NativeAMPPlugin(trainer)
        self.apex_plugin = ApexPlugin(trainer)

    def on_trainer_init(self, plugins: Optional[Union[str, list]]):
        self.plugins = plugins
        if self.plugins is None:
            self.plugins = []
        self.plugins = self._convert_str_custom_plugins(self.plugins)
        self.plugins = self._append_required_plugins(self.plugins)
        self.__attach_ddp()
        self.__attach_cluster()
        self.__attach_amp()
        self.__attach_apex()

    def __attach_amp(self):
        amp_plugin = self.__attach_plugin(NativeAMPPlugin)
        if amp_plugin:
            self.trainer.amp_backend = AMPType.NATIVE
            self.trainer.precision_connector.backend = amp_plugin

    def __attach_apex(self):
        apex_plugin = self.__attach_plugin(ApexPlugin)
        if apex_plugin:
            self.trainer.amp_backend = AMPType.APEX
            self.trainer.precision_connector.backend = apex_plugin

    def __attach_plugin(self, plugin_type, limit=1):
        count = 0
        plugin_result = None
        for plugin in self.plugins:
            if isinstance(plugin, plugin_type):
                # count the plugins of this type
                count += 1
                if count > limit:
                    m = f'you can only use one {plugin_type.__name__} in plugins. You passed in: {count}'
                    raise MisconfigurationException(m)
                plugin_result = plugin
        return plugin_result

    def __attach_ddp(self, limit=1):
        count = 0
        for plugin in self.plugins:
            if isinstance(plugin, DDPPlugin):
                # count the DDP plugins
                count += 1
                if count > limit:
                    m = f'you can only use one DDP plugin in plugins. You passed in: {count}'
                    raise MisconfigurationException(m)
                # set the ddp plugin
                self.ddp_plugin = plugin

    def __attach_cluster(self, limit=1):
        num_clusters = 0
        for plugin in self.plugins:
            if isinstance(plugin, ClusterEnvironment):
                # count the cluster environments
                num_clusters += 1
                if num_clusters > limit:
                    m = f'you can only use one cluster environment in plugins. You passed in: {num_clusters}'
                    raise MisconfigurationException(m)
                # set the cluster environment
                self.cloud_environment = plugin
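The three attach helpers above share one pattern: scan the plugin list, enforce an upper bound on matches of a given type, and keep the last match. A minimal standalone sketch of that pattern, with illustrative names that are not part of Lightning's API:

```python
# Standalone sketch of the "at most `limit` plugins of a given type" scan used by
# the attach helpers above; names here are illustrative, not Lightning's API.
def attach_one(plugins, plugin_type, limit=1):
    count = 0
    result = None
    for plugin in plugins:
        if isinstance(plugin, plugin_type):
            # count the plugins matching this type
            count += 1
            if count > limit:
                raise ValueError(
                    f'you can only use one {plugin_type.__name__} in plugins. You passed in: {count}'
                )
            # keep the (single) match
            result = plugin
    return result
```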

    def _convert_str_custom_plugins(self, plugins: Union[str, list]):
        """
        Converts string inputs to the corresponding supported lightning plugins.

        Args:
            plugins: List of plugins, or a string selecting a lightning plugin.

        Returns:
            List of plugins where strings have been replaced by plugin instances.
        """
        if isinstance(plugins, str):
            return [self._convert_str_to_plugin(plugins)]
        return [self._convert_str_to_plugin(plugin) for plugin in plugins]

    def _convert_str_to_plugin(self, plugin):
        if isinstance(plugin, str):
            if plugin not in LightningCustomPlugins.__members__:
                raise MisconfigurationException(
                    f" {plugin} is not a supported lightning custom plugin."
                    " If you're trying to pass a custom plugin, please pass this as an object to"
                    " Trainer(plugins=[MyPlugin()])."
                    f" Supported plugins as string input: {[e.name for e in LightningCustomPlugins]}."
                )
            plugin_cls = LightningCustomPlugins[plugin].value
            return plugin_cls(trainer=self.trainer)
        return plugin

    def _append_required_plugins(self, plugins: List[LightningPlugin]):
        """
        Allows custom plugins to define additional plugins. This is useful when custom plugins
        need to enforce an override of native amp/apex when they are enabled.

        Args:
            plugins: List of plugins.

        Returns:
            List of plugins containing additional plugins if needed.

        Example::

            class MyPlugin(DDPPlugin):
                def required_plugins(self):
                    return [MyCustomAMPPlugin()]

            # Will automatically add the necessary AMP plugin
            trainer = Trainer(plugins=[MyPlugin()])

            # Crashes, as MyPlugin enforces a custom AMP plugin
            trainer = Trainer(plugins=[MyPlugin(), NativeAMPPlugin()])
        """
        for plugin in plugins:
            required_plugins = plugin.required_plugins(amp_backend=self.trainer.amp_backend, trainer=self.trainer)
            if required_plugins:
                rank_zero_warn(
                    f'plugin {type(plugin)} has added additional required plugins as default:'
                    f' {[type(x) for x in required_plugins]}.'
                    ' Extend this plugin and override `required_plugins`'
                    ' if this conflicts with your additional plugins.'
                )
                plugins += required_plugins
        return plugins

    @classmethod
    def available_plugins(cls):
        """
        List of all available plugins that can be string arguments to the trainer.

        Returns:
            List of all available plugins that are supported as string arguments.
        """
        return [e.name for e in LightningCustomPlugins]


class LightningCustomPlugins(Enum):
    """
    String support for custom lightning plugins.

    Allows easier access to custom lightning plugins from the command line.
    """
    ddp_sharded = DDPShardedPlugin
    native_amp = NativeAMPPlugin
    apex_amp = ApexPlugin