Fix typo in docstring (#5835)
Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>
This commit is contained in:
parent
e429f97b67
commit
0a50bb406f
@@ -104,11 +104,12 @@ class ApexPlugin(PrecisionPlugin):
        """
        This code is a modification of :meth:`torch.nn.utils.clip_grad_norm_` using a higher epsilon for fp16 weights.
        This is important when setting amp_level to O2, and the master weights are in fp16.

        Args:
            grad_clip_val: Maximum norm of gradients.
            optimizer: Optimizer with gradients that will be clipped.
            norm_type: (float or int): type of the used p-norm. Can be ``'inf'`` for
                infinity norm.
        """
        model = self.trainer.get_model()
        parameters = model.parameters()
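The docstring above describes a variant of gradient-norm clipping that uses a larger epsilon in the denominator so the clip coefficient stays numerically stable when the master weights (and hence gradient norms) are in fp16. A minimal, pure-Python sketch of that idea follows; it is not the Lightning implementation, and the function name, the `eps` default, and the use of plain lists of floats in place of parameter gradients are all illustrative assumptions:

```python
import math

def clip_grad_norm_(parameters, max_norm, norm_type=2.0, eps=1e-4):
    """Clip gradients in-place by their global norm.

    Sketch of clip-by-global-norm with a larger epsilon than the
    torch.nn.utils.clip_grad_norm_ default, as the docstring suggests
    for fp16 master weights. `parameters` is a list of lists of floats
    standing in for per-parameter gradients (an assumption for this demo).
    """
    if norm_type == math.inf:
        # Infinity norm: largest absolute gradient entry overall.
        total_norm = max(max(abs(g) for g in grads) for grads in parameters)
    else:
        # p-norm over all gradient entries of all parameters.
        total_norm = sum(
            sum(abs(g) ** norm_type for g in grads) for grads in parameters
        ) ** (1.0 / norm_type)
    # The larger eps keeps this ratio finite/stable for tiny fp16 norms.
    clip_coef = max_norm / (total_norm + eps)
    if clip_coef < 1.0:
        for grads in parameters:
            for i, g in enumerate(grads):
                grads[i] = g * clip_coef
    return total_norm
```

For example, gradients `[[3.0, 4.0]]` have a 2-norm of 5.0; with `max_norm=1.0` they are scaled down to roughly `[0.6, 0.8]`.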