Fix spelling errors (#18773)

fix spelling errors
Adrian Wälchli 2023-10-30 22:14:52 +01:00 committed by GitHub
parent 079544a902
commit e0ba4d46e1
7 changed files with 7 additions and 7 deletions

View File

@@ -115,7 +115,7 @@ See the Full Example
.. note::
-When running a Lighting App on your local machine, any :class:`~lightning.app.utilities.packaging.cloud_compute.CloudCompute`
+When running a Lightning App on your local machine, any :class:`~lightning.app.utilities.packaging.cloud_compute.CloudCompute`
configuration (including a :class:`~lightning.app.storage.mount.Mount`) is ignored at runtime. If you need access to
these files on your local disk, you should download a copy of them to your machine.
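
For context, a minimal sketch of the kind of configuration this note refers to, assuming the public `CloudCompute(mounts=...)` API; the work class and bucket path are illustrative names, not taken from the docs page:

```python
from lightning.app import CloudCompute, LightningWork
from lightning.app.storage import Mount


class MyWork(LightningWork):  # illustrative work class
    def __init__(self):
        super().__init__(
            cloud_compute=CloudCompute(
                "cpu",
                # The mounted S3 data appears under /data/ in the cloud.
                # When the app runs locally, this whole CloudCompute config
                # (Mount included) is ignored, as the note above says.
                mounts=Mount(source="s3://my-bucket/files/", mount_path="/data/"),
            )
        )

    def run(self):
        ...  # read from /data/ only when running in the cloud
```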

View File

@@ -186,7 +186,7 @@ configuration files and automatic creation of objects, so you don't need to do i
.. note::
-Lighting automatically registers all subclasses of :class:`~lightning.pytorch.core.LightningModule`,
+Lightning automatically registers all subclasses of :class:`~lightning.pytorch.core.LightningModule`,
so the complete import path is not required for them and can be replaced by the class name.
.. note::
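
A hedged sketch of what that registration means in practice for `LightningCLI` config files; `MyModel` and the YAML snippet in the comment are illustrative, not part of the page being edited:

```python
from lightning.pytorch import LightningModule
from lightning.pytorch.cli import LightningCLI


class MyModel(LightningModule):  # illustrative subclass
    ...


# Because MyModel subclasses LightningModule it is registered automatically,
# so a config file can name the class without its full import path:
#
#   model:
#     class_path: MyModel   # instead of e.g. my_package.models.MyModel
#
if __name__ == "__main__":
    LightningCLI()  # resolves the class name from the registry when parsing the config
```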

View File

@@ -175,7 +175,7 @@ Configure all aspects of Lightning for advanced usecases.
.. displayitem::
:header: Level 16: Own the training loop
-:description: Learn all the ways of owning your raw PyTorch loops with Lighting.
+:description: Learn all the ways of owning your raw PyTorch loops with Lightning.
:col_css: col-md-6
:button_link: levels/advanced_level_17.html
:height: 150

View File

@@ -31,7 +31,7 @@ Configure all aspects of Lightning for advanced usecases.
.. displayitem::
:header: Level 16: Own the training loop
-:description: Learn all the ways of owning your raw PyTorch loops with Lighting.
+:description: Learn all the ways of owning your raw PyTorch loops with Lightning.
:col_css: col-md-6
:button_link: advanced_level_17.html
:height: 150

View File

@@ -74,7 +74,7 @@
- rely on Torch native AMP
- `PR12312`_
-* - used ``LightingModule.use_amp`` attribute
+* - used ``LightningModule.use_amp`` attribute
- rely on Torch native AMP
- `PR12315`_
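
For context on the replacement this table prescribes, a short sketch of relying on Torch native AMP instead of the removed ``LightningModule.use_amp`` attribute (the Trainer flag shown is the Lightning 2.x spelling):

```python
import torch
import lightning.pytorch as pl

# In Lightning, mixed precision is requested on the Trainer rather than
# read from a `use_amp` attribute:
trainer = pl.Trainer(precision="16-mixed")

# The raw PyTorch equivalent uses torch.autocast plus a gradient scaler:
scaler = torch.cuda.amp.GradScaler()
with torch.autocast(device_type="cuda", dtype=torch.float16):
    ...  # forward pass runs under native AMP
```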

View File

@@ -29,7 +29,7 @@ The main architecture is the following:
<img src="https://pl-public-data.s3.amazonaws.com/assets_lightning/examples/fabric/reinforcement-learning/fabric_coupled.png">
</p>
-where `N+1` processes (labelled *rank-0*, ..., *rank-N* in the figure above) will be spawned by Fabric/PyTorch, each of them running `M+1` independent copies of the environment (*Env-0*, ..., *Env-M*). Every rank has its own copy of the agent, represented by a [LightningModule](https://lightning.ai/docs/pytorch/stable/common/lightning_module.html)/[Pytorch Module](https://pytorch.org/docs/stable/generated/torch.nn.Module.html), which will be updated through distributed training.
+where `N+1` processes (labelled *rank-0*, ..., *rank-N* in the figure above) will be spawned by Fabric/PyTorch, each of them running `M+1` independent copies of the environment (*Env-0*, ..., *Env-M*). Every rank has its own copy of the agent, represented by a [LightningModule](https://lightning.ai/docs/pytorch/stable/common/lightning_module.html)/[PyTorch Module](https://pytorch.org/docs/stable/generated/torch.nn.Module.html), which will be updated through distributed training.
### Raw PyTorch:
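
A hedged sketch of the process/environment layout described in the paragraph above, with assumed stand-ins (`CartPole-v1`, a linear agent) in place of the example's real components:

```python
import gymnasium as gym
import torch.nn as nn
from lightning.fabric import Fabric

NUM_ENVS = 4  # M + 1 environment copies per rank


def main(fabric: Fabric):
    # Every rank builds its own independent vector of environments...
    envs = gym.vector.SyncVectorEnv(
        [lambda: gym.make("CartPole-v1") for _ in range(NUM_ENVS)]
    )
    # ...and its own copy of the agent; fabric.setup() wraps it for
    # distributed training so updates stay synchronized across ranks.
    agent = fabric.setup(nn.Linear(4, 2))  # stand-in for the LightningModule agent
    ...


if __name__ == "__main__":
    fabric = Fabric(accelerator="cpu", devices=2)  # N + 1 = 2 processes
    fabric.launch(main)  # passes the Fabric object to main() on each rank
```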

View File

@@ -46,7 +46,7 @@ def from_compiled(model: "torch._dynamo.OptimizedModule") -> "pl.LightningModule
if not isinstance(orig_module, pl.LightningModule):
_check_mixed_imports(model)
raise ValueError(
f"`model` is expected to be a compiled LightingModule. Found a `{type(orig_module).__name__}` instead"
f"`model` is expected to be a compiled LightningModule. Found a `{type(orig_module).__name__}` instead"
)
orig_module._compiler_ctx = {
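
A hedged usage sketch for the function in this hunk, assuming it is exposed as `lightning.pytorch.utilities.compile.from_compiled`; `BoringModel` is an illustrative module, not from this diff:

```python
import torch
import lightning.pytorch as pl
from lightning.pytorch.utilities.compile import from_compiled


class BoringModel(pl.LightningModule):  # illustrative
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def forward(self, x):
        return self.layer(x)


compiled = torch.compile(BoringModel())  # a torch._dynamo.OptimizedModule
lit_module = from_compiled(compiled)     # back to a usable LightningModule

# Compiling a plain nn.Module instead would trip the ValueError shown
# above, since its original module is not a LightningModule.
```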