Fix typo in docstring (#5835)

Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>
Gyeongjae Choi 2021-02-08 23:53:46 +09:00 committed by GitHub
parent e429f97b67
commit 0a50bb406f
GPG Key ID: 4AEE18F83AFDEB23
1 changed file with 2 additions and 1 deletion


@@ -104,11 +104,12 @@ class ApexPlugin(PrecisionPlugin):
"""
This code is a modification of :meth:`torch.nn.utils.clip_grad_norm_` using a higher epsilon for fp16 weights.
This is important when setting amp_level to O2, and the master weights are in fp16.
Args:
grad_clip_val: Maximum norm of gradients.
optimizer: Optimizer with gradients that will be clipped.
norm_type: (float or int): type of the used p-norm. Can be ``'inf'`` for
infinity norm.
infinity norm.
"""
model = self.trainer.get_model()
parameters = model.parameters()
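For context, the behavior this docstring describes — gradient-norm clipping with a larger epsilon so the division stays numerically stable when the master weights are fp16 (as with amp_level O2) — can be sketched roughly as below. The function name and the epsilon value are illustrative assumptions, not Lightning's actual implementation:

```python
import torch

def clip_grad_norm_fp16(parameters, max_norm, norm_type=2.0, eps=1e-4):
    # Illustrative sketch of torch.nn.utils.clip_grad_norm_ with a larger
    # eps: fp16 has limited precision near zero, so a tiny eps (e.g. 1e-6)
    # can vanish when the weights/grads are half precision.
    params = [p for p in parameters if p.grad is not None]
    if not params:
        return torch.tensor(0.0)
    # Total p-norm over all gradients.
    total_norm = torch.norm(
        torch.stack([torch.norm(p.grad.detach(), norm_type) for p in params]),
        norm_type,
    )
    # Scale gradients down only if they exceed max_norm.
    clip_coef = max_norm / (total_norm + eps)
    if clip_coef < 1:
        for p in params:
            p.grad.detach().mul_(clip_coef)
    return total_norm
```

Calling it with `norm_type=float('inf')` gives the infinity norm mentioned in the docstring, mirroring the upstream `clip_grad_norm_` API.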