diff --git a/docs/source-app/workflows/mount_cloud_object_store.rst b/docs/source-app/workflows/mount_cloud_object_store.rst
index 72e6fa0350..31ac2a44aa 100644
--- a/docs/source-app/workflows/mount_cloud_object_store.rst
+++ b/docs/source-app/workflows/mount_cloud_object_store.rst
@@ -115,7 +115,7 @@ See the Full Example

 .. note::

-   When running a Lighting App on your local machine, any :class:`~lightning.app.utilities.packaging.cloud_compute.CloudCompute`
+   When running a Lightning App on your local machine, any :class:`~lightning.app.utilities.packaging.cloud_compute.CloudCompute`
    configuration (including a :class:`~lightning.app.storage.mount.Mount`) is ignored at runtime.
    If you need access to these files on your local disk, you should download a copy of them to your machine.

diff --git a/docs/source-pytorch/cli/lightning_cli_advanced.rst b/docs/source-pytorch/cli/lightning_cli_advanced.rst
index 7a6ed2c96b..9f73533772 100644
--- a/docs/source-pytorch/cli/lightning_cli_advanced.rst
+++ b/docs/source-pytorch/cli/lightning_cli_advanced.rst
@@ -186,7 +186,7 @@ configuration files and automatic creation of objects, so you don't need to do i

 .. note::

-    Lighting automatically registers all subclasses of :class:`~lightning.pytorch.core.LightningModule`,
+    Lightning automatically registers all subclasses of :class:`~lightning.pytorch.core.LightningModule`,
     so the complete import path is not required for them and can be replaced by the class name.

 .. note::
diff --git a/docs/source-pytorch/expertise_levels.rst b/docs/source-pytorch/expertise_levels.rst
index d1b074498d..5988890332 100644
--- a/docs/source-pytorch/expertise_levels.rst
+++ b/docs/source-pytorch/expertise_levels.rst
@@ -175,7 +175,7 @@ Configure all aspects of Lightning for advanced usecases.

 .. displayitem::
    :header: Level 16: Own the training loop
-   :description: Learn all the ways of owning your raw PyTorch loops with Lighting.
+   :description: Learn all the ways of owning your raw PyTorch loops with Lightning.
    :col_css: col-md-6
    :button_link: levels/advanced_level_17.html
    :height: 150
diff --git a/docs/source-pytorch/levels/advanced.rst b/docs/source-pytorch/levels/advanced.rst
index d2b0fc4370..1ea809d6fa 100644
--- a/docs/source-pytorch/levels/advanced.rst
+++ b/docs/source-pytorch/levels/advanced.rst
@@ -31,7 +31,7 @@ Configure all aspects of Lightning for advanced usecases.

 .. displayitem::
    :header: Level 16: Own the training loop
-   :description: Learn all the ways of owning your raw PyTorch loops with Lighting.
+   :description: Learn all the ways of owning your raw PyTorch loops with Lightning.
    :col_css: col-md-6
    :button_link: advanced_level_17.html
    :height: 150
diff --git a/docs/source-pytorch/upgrade/sections/1_7_devel.rst b/docs/source-pytorch/upgrade/sections/1_7_devel.rst
index 11fab55db5..211b56e861 100644
--- a/docs/source-pytorch/upgrade/sections/1_7_devel.rst
+++ b/docs/source-pytorch/upgrade/sections/1_7_devel.rst
@@ -74,7 +74,7 @@
      - rely on Torch native AMP
      - `PR12312`_

-   * - used ``LightingModule.use_amp`` attribute
+   * - used ``LightningModule.use_amp`` attribute
      - rely on Torch native AMP
      - `PR12315`_
diff --git a/examples/fabric/reinforcement_learning/README.md b/examples/fabric/reinforcement_learning/README.md
index 14677c1ef0..0a8cac757c 100644
--- a/examples/fabric/reinforcement_learning/README.md
+++ b/examples/fabric/reinforcement_learning/README.md
@@ -29,7 +29,7 @@ The main architecture is the following:

-where `N+1` processes (labelled *rank-0*, ..., *rank-N* in the figure above) will be spawned by Fabric/PyTorch, each of them running `M+1` independent copies of the environment (*Env-0*, ..., *Env-M*). Every rank has its own copy of the agent, represented by a [LightningModule](https://lightning.ai/docs/pytorch/stable/common/lightning_module.html)/[Pytorch Module](https://pytorch.org/docs/stable/generated/torch.nn.Module.html), which will be updated through distributed training.
+where `N+1` processes (labelled *rank-0*, ..., *rank-N* in the figure above) will be spawned by Fabric/PyTorch, each of them running `M+1` independent copies of the environment (*Env-0*, ..., *Env-M*). Every rank has its own copy of the agent, represented by a [LightningModule](https://lightning.ai/docs/pytorch/stable/common/lightning_module.html)/[PyTorch Module](https://pytorch.org/docs/stable/generated/torch.nn.Module.html), which will be updated through distributed training.

 ### Raw PyTorch:
diff --git a/src/lightning/pytorch/utilities/compile.py b/src/lightning/pytorch/utilities/compile.py
index 0b38c4d794..a77ed553d4 100644
--- a/src/lightning/pytorch/utilities/compile.py
+++ b/src/lightning/pytorch/utilities/compile.py
@@ -46,7 +46,7 @@ def from_compiled(model: "torch._dynamo.OptimizedModule") -> "pl.LightningModule
     if not isinstance(orig_module, pl.LightningModule):
         _check_mixed_imports(model)
         raise ValueError(
-            f"`model` is expected to be a compiled LightingModule. Found a `{type(orig_module).__name__}` instead"
+            f"`model` is expected to be a compiled LightningModule. Found a `{type(orig_module).__name__}` instead"
         )

     orig_module._compiler_ctx = {