Add HPU Accelerator column to the precision doc (#12499)

Kaushik B 2022-03-29 08:06:13 +05:30 committed by GitHub
parent 486f07be7b
commit 9cd6d0f6ad
1 changed file with 7 additions and 2 deletions


@@ -20,7 +20,7 @@ Higher precision, such as the 64-bit floating-point, can be used for highly sens
 Following are the precisions available in Lightning along with their supported Accelerator:

 .. list-table:: Precision with Accelerators
-   :widths: 20 20 20 20 20
+   :widths: 20 20 20 20 20 20
    :header-rows: 1

    * - Precision
@@ -28,26 +28,31 @@ Following are the precisions available in Lightning along with their supported A
      - GPU
      - TPU
      - IPU
+     - HPU
    * - 16
      - No
      - Yes
      - No
      - Yes
+     - No
    * - BFloat16
      - Yes
      - Yes
      - Yes
      - No
+     - Yes
    * - 32
      - Yes
      - Yes
      - Yes
      - Yes
+     - Yes
    * - 64
      - Yes
      - Yes
      - No
      - No
+     - No

 ***************
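To make the table being extended above concrete, here is a minimal usage sketch (not part of this commit) of selecting a precision for a given accelerator through the ``Trainer`` arguments. It assumes a Lightning release with HPU support installed; the exact accelerator strings and precision values shown are taken from the table and the ``Trainer(precision=...)`` API, and HPU runs additionally require Habana hardware.

.. code-block:: python

    import pytorch_lightning as pl

    # BFloat16 on HPU: per the table above, HPU supports bf16 and 32-bit,
    # but not 16-bit or 64-bit precision.
    trainer = pl.Trainer(accelerator="hpu", devices=1, precision="bf16")

    # 16-bit mixed precision on GPU, which the table lists as supported.
    trainer = pl.Trainer(accelerator="gpu", devices=1, precision=16)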
@@ -224,4 +229,4 @@ You can also customize and pass your own Precision Plugin by subclassing the :cl
 ***************

 It is possible to further reduce the precision using third-party libraries like `bitsandbytes <https://github.com/facebookresearch/bitsandbytes>`_. Although,
-Lightning doesn't support it out of the box yet but you can still use it by configuring it in your LightningModule and setting ``Trainer(precision=32)``.
+Lightning doesn't support it out of the box yet but you can still use it by configuring it in your :class:`~pytorch_lightning.core.lightning.LightningModule` and setting ``Trainer(precision=32)``.
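As a hedged illustration of the workflow that final hunk describes (again, not part of this commit), the sketch below configures an 8-bit optimizer from bitsandbytes inside a LightningModule while the Trainer stays at ``precision=32``. It assumes bitsandbytes is installed and exposes an 8-bit Adam as ``bnb.optim.Adam8bit``; the model and hyperparameters are placeholders.

.. code-block:: python

    import bitsandbytes as bnb
    import pytorch_lightning as pl
    from torch import nn
    from torch.nn import functional as F


    class LitModel(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.layer = nn.Linear(32, 2)

        def training_step(self, batch, batch_idx):
            x, y = batch
            return F.cross_entropy(self.layer(x), y)

        def configure_optimizers(self):
            # bitsandbytes optimizers subclass torch.optim.Optimizer,
            # so Lightning can drive them like any other optimizer.
            return bnb.optim.Adam8bit(self.parameters(), lr=1e-3)


    # Precision stays at 32 on the Lightning side, as suggested above.
    trainer = pl.Trainer(precision=32)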