From 9cd6d0f6addd1f359e47d34e0e226163dca8b008 Mon Sep 17 00:00:00 2001
From: Kaushik B <45285388+kaushikb11@users.noreply.github.com>
Date: Tue, 29 Mar 2022 08:06:13 +0530
Subject: [PATCH] Add HPU Accelerator column to the precision doc (#12499)

---
 docs/source/advanced/precision.rst | 9 +++++++--
 1 file changed, 7 insertions(+), 2 deletions(-)

diff --git a/docs/source/advanced/precision.rst b/docs/source/advanced/precision.rst
index 9dd31f0d65..13900ffb5b 100644
--- a/docs/source/advanced/precision.rst
+++ b/docs/source/advanced/precision.rst
@@ -20,7 +20,7 @@ Higher precision, such as the 64-bit floating-point, can be used for highly sens
 Following are the precisions available in Lightning along with their supported Accelerator:
 
 .. list-table:: Precision with Accelerators
-   :widths: 20 20 20 20 20
+   :widths: 20 20 20 20 20 20
    :header-rows: 1
 
    * - Precision
@@ -28,26 +28,31 @@ Following are the precisions available in Lightning along with their supported A
      - GPU
      - TPU
      - IPU
+     - HPU
    * - 16
      - No
      - Yes
      - No
      - Yes
+     - No
    * - BFloat16
      - Yes
      - Yes
      - Yes
      - No
+     - Yes
    * - 32
      - Yes
      - Yes
      - Yes
      - Yes
+     - Yes
    * - 64
      - Yes
      - Yes
      - No
      - No
+     - No
 
 
 ***************
@@ -224,4 +229,4 @@ You can also customize and pass your own Precision Plugin by subclassing the :cl
 ***************
 
 It is possible to further reduce the precision using third-party libraries like `bitsandbytes `_. Although,
-Lightning doesn't support it out of the box yet but you can still use it by configuring it in your LightningModule and setting ``Trainer(precision=32)``.
+Lightning doesn't support it out of the box yet but you can still use it by configuring it in your :class:`~pytorch_lightning.core.lightning.LightningModule` and setting ``Trainer(precision=32)``.
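
Note (editorial addition, not part of the patch): as a minimal sketch of how the precision values in the table above map to ``Trainer`` arguments. The accelerator/precision pairs below follow the table, and the ``"hpu"`` line assumes the Habana integration documented by this PR is installed:

.. code-block:: python

    from pytorch_lightning import Trainer

    # FP16 mixed precision on GPU ("Yes" in the table).
    trainer = Trainer(accelerator="gpu", devices=1, precision=16)

    # BFloat16 on HPU, the combination the new column marks as supported
    # (assumes the Habana/HPU integration is available).
    trainer = Trainer(accelerator="hpu", devices=1, precision="bf16")

    # Full 64-bit precision on CPU ("Yes" in the table).
    trainer = Trainer(accelerator="cpu", precision=64)

    # Keep weights in FP32 (precision=32) when pairing Lightning with a
    # third-party low-precision library such as bitsandbytes, per the
    # closing note in the patched document.
    trainer = Trainer(accelerator="gpu", devices=1, precision=32)

These flags only select Lightning's precision plugin; whether a given pair actually runs also depends on the corresponding hardware and drivers being present.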