Document the return value of `Fabric.clip_gradients()` (#19457)
This commit is contained in:
parent 0a77aa2cc5
commit 4bcc4f1cf7
@@ -108,6 +108,7 @@ This is useful if your model experiences *exploding gradients* during training.
 
     fabric.clip_gradients(model, optimizer, max_norm=2.0, norm_type="inf")
 
 The :meth:`~lightning.fabric.fabric.Fabric.clip_gradients` method is agnostic to the precision and strategy being used.
+If you pass ``max_norm`` as the argument, ``clip_gradients`` will return the total norm of the gradients (before clipping was applied) as a scalar tensor.
 
 to_device
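The documentation snippet above clips by the infinity norm (``norm_type="inf"``). As a rough illustration of what the two common norm choices compute, here is a minimal pure-Python sketch over plain floats; the ``grad_norm`` helper is hypothetical and not part of the Fabric API:

```python
def grad_norm(grads, norm_type=2.0):
    """Total norm of a flat list of gradient values (simplified sketch)."""
    if norm_type == float("inf"):
        # Infinity norm: the single largest gradient magnitude.
        return max(abs(g) for g in grads)
    # p-norm: (sum of |g|^p) ** (1/p); norm_type=2.0 is the default 2-norm.
    return sum(abs(g) ** norm_type for g in grads) ** (1.0 / norm_type)

grads = [0.5, -3.0, 1.5]
grad_norm(grads, norm_type=2.0)           # ≈ 3.39 (sqrt of 11.5)
grad_norm(grads, norm_type=float("inf"))  # 3.0 (largest magnitude)
```

Clipping by the infinity norm only reacts to the single largest gradient component, while the 2-norm reacts to the overall magnitude across all parameters.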
@@ -468,6 +468,10 @@ class Fabric:
                 Default is the 2-norm.
             error_if_nonfinite: An error is raised if the total norm of the gradients is NaN or infinite.
 
+        Return:
+            The total norm of the gradients (before clipping was applied) as a scalar tensor if ``max_norm`` was
+            passed, otherwise ``None``.
+
         """
         if clip_val is not None and max_norm is not None:
            raise ValueError(
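The docstring change above documents that the pre-clipping norm is returned only when ``max_norm`` is given, and the code below it rejects passing both ``clip_val`` and ``max_norm``. A minimal sketch of that contract, with plain floats standing in for tensors (this ``clip_gradients`` is a hypothetical stand-in, not the real Fabric method):

```python
def clip_gradients(grads, clip_val=None, max_norm=None, norm_type=2.0):
    """Sketch of the documented contract: returns the total norm (before
    clipping) if max_norm is given, otherwise None."""
    if clip_val is not None and max_norm is not None:
        # Mirrors the ValueError in the hunk: the two modes are exclusive.
        raise ValueError("Only one of clip_val or max_norm can be set.")
    if clip_val is not None:
        # Clip-by-value: clamp each gradient into [-clip_val, clip_val].
        grads[:] = [max(-clip_val, min(clip_val, g)) for g in grads]
        return None  # no norm is computed in this mode
    if max_norm is not None:
        if norm_type == float("inf"):
            total_norm = max(abs(g) for g in grads)
        else:
            total_norm = sum(abs(g) ** norm_type for g in grads) ** (1.0 / norm_type)
        # PyTorch-style scaling: shrink only when the norm exceeds max_norm.
        scale = min(1.0, max_norm / (total_norm + 1e-6))
        grads[:] = [g * scale for g in grads]
        return total_norm  # the norm *before* clipping was applied
    return None
```

For example, with gradients ``[3.0, 4.0]`` and ``max_norm=2.0``, the call returns the pre-clipping 2-norm ``5.0`` and scales the gradients down to a norm of roughly 2.0; with ``clip_val`` it returns ``None``.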