Document Gradient Clipping during Manual Optimization (#16023)

Co-authored-by: Nikhil Shenoy <nikhilshenoy@dhcp-128-189-224-163.ubcsecure.wireless.ubc.ca>
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Nikhil Shenoy 2022-12-13 13:21:59 -08:00 committed by GitHub
parent 3e664c906b
commit 53759825bb
2 changed files with 39 additions and 2 deletions


@ -98,6 +98,39 @@ after every ``N`` steps, you can do as such.
opt.step()
opt.zero_grad()

Gradient Clipping
=================

You can clip optimizer gradients during manual optimization, similar to passing the ``gradient_clip_val`` and
``gradient_clip_algorithm`` arguments to the :ref:`Trainer <trainer>` during automatic optimization.
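
For comparison, the automatic-optimization equivalent is configured once on the Trainer rather than called inside
``training_step``. A minimal sketch (the clipping value shown is just an example):

.. code-block:: python

    from pytorch_lightning import Trainer

    # with automatic optimization, the Trainer clips gradients for you after each backward pass
    trainer = Trainer(gradient_clip_val=0.5, gradient_clip_algorithm="norm")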

To perform gradient clipping with a single optimizer during manual optimization, you can do the following.

.. testcode:: python

    from pytorch_lightning import LightningModule


    class SimpleModel(LightningModule):
        def __init__(self):
            super().__init__()
            self.automatic_optimization = False

        def training_step(self, batch, batch_idx):
            opt = self.optimizers()

            # compute loss
            loss = self.compute_loss(batch)

            opt.zero_grad()
            self.manual_backward(loss)

            # clip gradients
            self.clip_gradients(opt, gradient_clip_val=0.5, gradient_clip_algorithm="norm")

            opt.step()

.. warning::

    ``configure_gradient_clipping()`` is not called during manual optimization. Instead, call
    ``self.clip_gradients()`` manually, as shown in the example above.
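
The same pattern extends to stepping more than one optimizer manually (see the next section). Below is a minimal
sketch assuming a GAN-style setup with hypothetical ``compute_g_loss`` and ``compute_d_loss`` helpers:

.. code-block:: python

    def training_step(self, batch, batch_idx):
        # optimizers are returned in the order they were defined in configure_optimizers
        g_opt, d_opt = self.optimizers()

        # generator update with clipped gradients
        g_loss = self.compute_g_loss(batch)
        g_opt.zero_grad()
        self.manual_backward(g_loss)
        self.clip_gradients(g_opt, gradient_clip_val=0.5, gradient_clip_algorithm="norm")
        g_opt.step()

        # discriminator update with clipped gradients
        d_loss = self.compute_d_loss(batch)
        d_opt.zero_grad()
        self.manual_backward(d_loss)
        self.clip_gradients(d_opt, gradient_clip_val=0.5, gradient_clip_algorithm="norm")
        d_opt.step()
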
Use Multiple Optimizers (like GANs)
===================================


@ -1471,8 +1471,12 @@ class LightningModule(
"""Handles gradient clipping internally.
Note:
    - Do not override this method. If you want to customize gradient clipping, consider using
      the :meth:`configure_gradient_clipping` method.
    - For manual optimization (``self.automatic_optimization = False``), if you want to use
      gradient clipping, consider calling
      ``self.clip_gradients(opt, gradient_clip_val=0.5, gradient_clip_algorithm="norm")``
      manually in the training step.

Args:
    optimizer: Current optimizer being used.