
.. _plugins:

#######
Plugins
#######

.. include:: ../links.rst

Plugins allow custom integrations into the internals of the Trainer, such as custom precision or
distributed implementations.

Under the hood, the Lightning Trainer uses plugins in the training routine, added automatically
depending on the provided Trainer arguments. For example:
.. code-block:: python

    # accelerator: GPUAccelerator
    # training strategy: DDPStrategy
    # precision: NativeMixedPrecisionPlugin
    trainer = Trainer(accelerator="gpu", devices=4, precision=16)
We expose Accelerators and Plugins mainly for expert users who want to extend Lightning for:

- New hardware (like TPU plugins)
- Distributed backends (e.g. a backend not yet supported by
  `PyTorch <https://pytorch.org/docs/stable/distributed.html#backends>`_ itself)
- Clusters (e.g. customized access to the cluster's environment interface)
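Plugins and strategies can also be passed to the Trainer explicitly instead of being selected from the arguments. A sketch of such an explicit configuration (the exact constructor arguments depend on your setup):

.. code-block:: python

    from pytorch_lightning import Trainer
    from pytorch_lightning.plugins.environments import SLURMEnvironment
    from pytorch_lightning.strategies import DDPStrategy

    # Equivalent in spirit to the automatic selection above, but explicit:
    trainer = Trainer(
        accelerator="gpu",
        devices=4,
        strategy=DDPStrategy(find_unused_parameters=False),
        plugins=[SLURMEnvironment(auto_requeue=False)],
    )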
There are three types of Plugins in Lightning with different responsibilities:

Strategy
--------

- Launching and teardown of training processes (if applicable)
- Setup of communication between processes (NCCL, GLOO, MPI, ...)
- Provide a unified communication interface for reduction, broadcast, etc.
- Provide access to the wrapped LightningModule
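To illustrate the "unified communication interface" point, here is a minimal sketch (hypothetical class names, not Lightning's actual API) of the kind of interface a strategy exposes, with a trivial single-process implementation:

.. code-block:: python

    from typing import List


    class CollectiveInterface:
        """Shape of a strategy's communication API (illustrative only)."""

        def reduce(self, values: List[float]) -> float:
            raise NotImplementedError

        def broadcast(self, value, src: int = 0):
            raise NotImplementedError


    class SingleProcessStrategy(CollectiveInterface):
        # With a single process, a "reduction" is a local sum and
        # a "broadcast" is a no-op that returns the value unchanged.
        def reduce(self, values: List[float]) -> float:
            return sum(values)

        def broadcast(self, value, src: int = 0):
            return value

A distributed strategy would implement the same interface on top of a backend such as NCCL or GLOO, so the rest of the Trainer never needs to know how many processes are running.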
Furthermore, for multi-node training Lightning provides cluster environment plugins that allow the advanced user
to configure Lightning to integrate with a :ref:`custom-cluster`.

.. image:: ../_static/images/accelerator/overview.svg

The full list of built-in plugins is listed below.

.. warning:: The Plugin API is in beta and subject to change.
    For help setting up custom plugins/accelerators, please reach out to us at **support@pytorchlightning.ai**
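The core job of a cluster environment plugin is to derive rank information from whatever the scheduler provides. As a standalone sketch (not Lightning's actual implementation), here is how that looks for SLURM, which exports per-task environment variables such as ``SLURM_PROCID`` and ``SLURM_NTASKS``:

.. code-block:: python

    import os


    def slurm_global_rank(env=os.environ) -> int:
        # SLURM_PROCID is set by ``srun`` for each task it launches.
        return int(env.get("SLURM_PROCID", 0))


    def slurm_world_size(env=os.environ) -> int:
        # SLURM_NTASKS is the total number of tasks in the job step.
        return int(env.get("SLURM_NTASKS", 1))

A custom cluster environment implements the same idea for a scheduler Lightning does not know about, mapping its environment variables onto ranks, world size, and the main process address.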
Precision Plugins
-----------------

.. currentmodule:: pytorch_lightning.plugins.precision

.. autosummary::
    :nosignatures:
    :template: classtemplate.rst

    ApexMixedPrecisionPlugin
    DeepSpeedPrecisionPlugin
    DoublePrecisionPlugin
    FullyShardedNativeMixedPrecisionPlugin
    HPUPrecisionPlugin
    IPUPrecisionPlugin
    MixedPrecisionPlugin
    NativeMixedPrecisionPlugin
    PrecisionPlugin
    ShardedNativeMixedPrecisionPlugin
    TPUBf16PrecisionPlugin
    TPUPrecisionPlugin
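To see why reduced precision needs dedicated plugin logic at all, note that float16 carries only about three decimal digits of precision, so values round noticeably when stored in half precision. A self-contained sketch using the standard library's IEEE-754 half-precision format (``'e'`` in the ``struct`` module):

.. code-block:: python

    import struct


    def to_fp16(x: float) -> float:
        """Round-trip a Python float through 16-bit storage."""
        return struct.unpack("e", struct.pack("e", x))[0]


    to_fp16(0.5)  # 0.5 is a power of two, so it survives exactly
    to_fp16(0.1)  # not exactly 0.1 anymore: rounded to the nearest float16

Mixed-precision plugins manage this trade-off, e.g. by keeping master weights and loss scaling in full precision while running compute in half precision.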
Cluster Environments
--------------------

.. currentmodule:: pytorch_lightning.plugins.environments

.. autosummary::
    :nosignatures:
    :template: classtemplate.rst

    ClusterEnvironment
    LightningEnvironment
    LSFEnvironment
    TorchElasticEnvironment
    KubeflowEnvironment
    SLURMEnvironment