############
Accelerators
############
Accelerators connect a Lightning Trainer to arbitrary hardware accelerators (CPUs, GPUs, TPUs, etc.). Accelerators
also manage distributed training modes (such as DP, DDP, or running on an HPC cluster).
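The idea above is a classic dispatch pattern: the Trainer talks to a single Accelerator interface, and each concrete accelerator handles its own hardware setup. The sketch below illustrates that pattern in plain Python; the class and method names are simplified stand-ins, not Lightning's actual internals.

```python
class Accelerator:
    """Illustrative base class: prepares a model for a particular device."""

    def setup(self, model):
        raise NotImplementedError


class CPUAccelerator(Accelerator):
    def setup(self, model):
        # A real implementation would call model.to("cpu"); this sketch
        # just reports where the model would run.
        return f"{model} on cpu"


class GPUAccelerator(Accelerator):
    def setup(self, model):
        return f"{model} on cuda:0"


class Trainer:
    """The Trainer only depends on the Accelerator interface, so swapping
    CPU/GPU/TPU (or a distributed mode) requires no Trainer changes."""

    def __init__(self, accelerator):
        self.accelerator = accelerator

    def fit(self, model):
        return self.accelerator.setup(model)


print(Trainer(CPUAccelerator()).fit("my_model"))  # my_model on cpu
print(Trainer(GPUAccelerator()).fit("my_model"))  # my_model on cuda:0
```

Because the Trainer never inspects which concrete accelerator it holds, new hardware backends can be added without touching the training loop.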
Accelerators can also be configured to run on arbitrary clusters using Plugins, or to link up to arbitrary
computational strategies such as 16-bit precision via AMP and Apex.
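The composition described above, where plugins layer cluster or precision behavior onto an accelerator, can be sketched as follows. This is a minimal illustration of the layering idea; the names (`ClusterPlugin`, `PrecisionPlugin`, `describe`) are hypothetical and do not mirror Lightning's real plugin API.

```python
class Plugin:
    """Illustrative base for behaviors layered onto an accelerator."""

    def apply(self, description):
        return description


class ClusterPlugin(Plugin):
    """Adapts training to run on a particular cluster environment."""

    def __init__(self, cluster_name):
        self.cluster_name = cluster_name

    def apply(self, description):
        return f"{description} on {self.cluster_name}"


class PrecisionPlugin(Plugin):
    """Switches compute to a lower-precision strategy (e.g. 16-bit AMP)."""

    def __init__(self, bits):
        self.bits = bits

    def apply(self, description):
        return f"{description} with {self.bits}-bit precision"


class Accelerator:
    def __init__(self, device, plugins=()):
        self.device = device
        self.plugins = list(plugins)

    def describe(self):
        # Each plugin decorates the base accelerator's behavior in turn,
        # so cluster and precision concerns stay out of the core class.
        description = f"training on {self.device}"
        for plugin in self.plugins:
            description = plugin.apply(description)
        return description


acc = Accelerator("gpu", [ClusterPlugin("slurm"), PrecisionPlugin(16)])
print(acc.describe())  # training on gpu on slurm with 16-bit precision
```

Keeping each concern in its own plugin means a cluster adapter and a precision strategy can be mixed and matched freely.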
**For help setting up a custom plugin/accelerator, please reach out to us at support@pytorchlightning.ai**