From 4bc2080c71adc96e6e2e3912ef0f515b1c8a2867 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Carlos=20Mochol=C3=AD?=
Date: Wed, 11 Jan 2023 15:45:09 +0100
Subject: [PATCH] Remove lightning_transformers from our docs (#16335)

---
 .../source-pytorch/ecosystem/transformers.rst | 47 -------------------
 1 file changed, 47 deletions(-)
 delete mode 100644 docs/source-pytorch/ecosystem/transformers.rst

diff --git a/docs/source-pytorch/ecosystem/transformers.rst b/docs/source-pytorch/ecosystem/transformers.rst
deleted file mode 100644
index d49e4ba54e..0000000000
--- a/docs/source-pytorch/ecosystem/transformers.rst
+++ /dev/null
@@ -1,47 +0,0 @@
-:orphan:
-
-Lightning Transformers
-======================
-
-`Lightning Transformers `_ offers a flexible interface for training and fine-tuning SOTA Transformer models
-using the :doc:`PyTorch Lightning Trainer <../common/trainer>`.
-
-.. code-block:: bash
-
-    pip install lightning-transformers
-
-In Lightning Transformers, we offer the following benefits:
-
-- Powered by `PyTorch Lightning `_ - Accelerators, custom Callbacks, Loggers, and high performance scaling with minimal changes.
-- Backed by `HuggingFace Transformers `_ models and datasets, spanning multiple modalities and tasks within NLP/Audio and Vision.
-- Task Abstraction for Rapid Research & Experimentation - Build your own custom transformer tasks across all modalities with little friction.
-- Powerful config composition backed by `Hydra `_ - simply swap out models, optimizers, schedulers task, and many more configurations without touching the code.
-- Seamless Memory and Speed Optimizations - Out-of-the-box training optimizations such as `DeepSpeed ZeRO `_ or `FairScale Sharded Training `_ with no code changes.
-
------------------
-
-Using Lightning-Transformers
-----------------------------
-
-Lightning Transformers has a collection of tasks for common NLP problems such as `language_modeling `_,
-`translation `_ and more. To use, simply:
-
-1. Pick a task to train (passed to ``train.py`` as ``task=``)
-
-2. Pick a dataset (passed to ``train.py`` as ``dataset=``)
-
-3. Customize the backbone, optimizer, or any component within the config
-
-4. Add any :doc:`Lightning supported parameters and optimizations <../common/trainer>`
-
-.. code-block:: bash
-
-    python train.py \
-        task= \
-        dataset=
-        backbone.pretrained_model_name_or_path= # Optionally change the HF backbone
-        optimizer= # Optionally specify optimizer (Default AdamW)
-        trainer. # Optionally specify Lightning trainer arguments
-
-
-To learn more about Lightning Transformers, please refer to the `Lightning Transformers documentation `_.
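
The removed page's final code block leaves the ``task=`` and ``dataset=`` values blank. As a rough sketch only, a filled-in invocation could look like the following; the specific config names (``nlp/language_modeling``, ``gpt2``, ``adamw``) and the trainer flags are assumptions for illustration, not taken from the removed page, and should be checked against the Lightning Transformers documentation.

.. code-block:: bash

    # Hypothetical example: the task, dataset, backbone, and optimizer
    # identifiers below are assumed Hydra config names, shown only to
    # illustrate the train.py override syntax described above.
    pip install lightning-transformers

    python train.py \
        task=nlp/language_modeling \
        dataset=nlp/language_modeling/wikitext \
        backbone.pretrained_model_name_or_path=gpt2 \
        optimizer=adamw \
        trainer.max_epochs=3

Any ``trainer.*`` override is forwarded to the Lightning ``Trainer``, so standard arguments such as ``trainer.max_epochs`` can be set from the command line without touching the code.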