docs: update broken links & latest/stable (#16994)

* docs: update links to PL latest

* also stable

* last

* Apply suggestions from code review

* Apply suggestions from code review

Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>

* fixing

* .

* fabric

* fixing

* .

---------

Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Jirka Borovec 2023-03-15 20:19:41 +01:00 committed by GitHub
parent a9d78dc8d0
commit 2f087ae30e
30 changed files with 60 additions and 61 deletions

@ -33,7 +33,6 @@ body:
- [**Metrics**](https://github.com/Lightning-AI/metrics):
Machine learning metrics for distributed, scalable PyTorch applications.
- [**Lite**](https://pytorch-lightning.readthedocs.io/en/latest/starter/lightning_lite.html):
enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.
- [**Flash**](https://github.com/Lightning-AI/lightning-flash):
The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.

@ -39,7 +39,6 @@ body:
- [**Metrics**](https://github.com/Lightning-AI/metrics):
Machine learning metrics for distributed, scalable PyTorch applications.
- [**Lite**](https://pytorch-lightning.readthedocs.io/en/latest/starter/lightning_lite.html):
enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.
- [**Flash**](https://github.com/Lightning-AI/lightning-flash):
The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.

@ -22,7 +22,6 @@ body:
- [**Metrics**](https://github.com/Lightning-AI/metrics):
Machine learning metrics for distributed, scalable PyTorch applications.
- [**Lite**](https://pytorch-lightning.readthedocs.io/en/latest/starter/lightning_lite.html):
enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.
- [**Flash**](https://github.com/Lightning-AI/lightning-flash):
The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.

.github/stale.yml

@ -29,8 +29,8 @@ pulls:
markComment: >
This pull request has been automatically marked as stale because it has not had recent activity.
It will be closed in 7 days if no further activity occurs. If you need further help see our docs:
https://pytorch-lightning.readthedocs.io/en/latest/generated/CONTRIBUTING.html#pull-request
or ask the assistance of a core contributor here or on Dicord.
https://lightning.ai/docs/pytorch/latest/generated/CONTRIBUTING.html#pull-request
or ask the assistance of a core contributor here or on Discord.
Thank you for your contributions.
# Comment to post when closing a stale issue. Set to `false` to disable
closeComment: >

@ -20,7 +20,7 @@ ______________________________________________________________________
<a href="src/lightning_app/README.md">Lightning Apps</a>
<a href="https://pytorch-lightning.readthedocs.io/en/stable/">Docs</a>
<a href="#community">Community</a>
<a href="https://pytorch-lightning.readthedocs.io/en/stable/generated/CONTRIBUTING.html">Contribute</a>
<a href="https://lightning.ai/docs/pytorch/stable/generated/CONTRIBUTING.html">Contribute</a>
</p>
<!-- DO NOT ADD CONDA DOWNLOADS... README CHANGES MUST BE APPROVED BY EDEN OR WILL -->
@ -32,7 +32,7 @@ ______________________________________________________________________
[![DockerHub](https://img.shields.io/docker/pulls/pytorchlightning/pytorch_lightning.svg)](https://hub.docker.com/r/pytorchlightning/pytorch_lightning)
[![codecov](https://codecov.io/gh/Lightning-AI/lightning/branch/master/graph/badge.svg?token=SmzX8mnKlA)](https://codecov.io/gh/Lightning-AI/lightning)
[![ReadTheDocs](https://readthedocs.org/projects/pytorch-lightning/badge/?version=stable)](https://pytorch-lightning.readthedocs.io/en/stable/)
[![ReadTheDocs](https://readthedocs.org/projects/pytorch-lightning/badge/?version=stable)](https://lightning.ai/docs/pytorch/stable/)
[![Discord](https://img.shields.io/discord/1077906959069626439?style=plastic)](https://discord.gg/VptPCZkGNa)
[![license](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://github.com/Lightning-AI/lightning/blob/master/LICENSE)
@ -111,7 +111,7 @@ trainer.fit(autoencoder, DataLoader(train), DataLoader(val))
## Advanced features
Lightning has over [40+ advanced features](https://pytorch-lightning.readthedocs.io/en/latest/common/trainer.html#trainer-flags) designed for professional AI research at scale.
Lightning has over [40+ advanced features](https://lightning.ai/docs/pytorch/stable/common/trainer.html#trainer-flags) designed for professional AI research at scale.
Here are some examples:
@ -479,10 +479,10 @@ ______________________________________________________________________
The lightning community is maintained by
- [10+ core contributors](https://pytorch-lightning.readthedocs.io/en/latest/governance.html) who are all a mix of professional engineers, Research Scientists, and Ph.D. students from top AI labs.
- [10+ core contributors](https://lightning.ai/docs/pytorch/latest/governance.html) who are all a mix of professional engineers, Research Scientists, and Ph.D. students from top AI labs.
- 590+ active community contributors.
Want to help us build Lightning and reduce boilerplate for thousands of researchers? [Learn how to make your first contribution here](https://pytorch-lightning.readthedocs.io/en/stable/generated/CONTRIBUTING.html)
Want to help us build Lightning and reduce boilerplate for thousands of researchers? [Learn how to make your first contribution here](https://lightning.ai/docs/pytorch/stable/generated/CONTRIBUTING.html)
Lightning is also part of the [PyTorch ecosystem](https://pytorch.org/ecosystem/) which requires projects to have solid testing, documentation and support.

@ -24,7 +24,7 @@ Let's assume you already have a folder with those two files.
train.py # your own script to train your models
requirements.txt # your python requirements.
If you don't, simply create a ``pl_project`` folder with those two files and add the following `PyTorch Lightning <https://pytorch-lightning.readthedocs.io/en/latest/>`_ code in the ``train.py`` file. This code trains a simple ``AutoEncoder`` on `MNIST Dataset <https://en.wikipedia.org/wiki/MNIST_database>`_.
If you don't, simply create a ``pl_project`` folder with those two files and add the following `PyTorch Lightning <https://lightning.ai/docs/pytorch/latest/>`_ code in the ``train.py`` file. This code trains a simple ``AutoEncoder`` on `MNIST Dataset <https://en.wikipedia.org/wiki/MNIST_database>`_.
.. literalinclude:: ../code_samples/convert_pl_to_app/train.py
@ -39,7 +39,7 @@ Simply run the following commands in your terminal to install the requirements a
pip install -r requirements.txt
python train.py
Get through `PyTorch Lightning Introduction <https://pytorch-lightning.readthedocs.io/en/stable/starter/introduction.html#step-1-define-lightningmodule>`_ to learn more.
Get through `PyTorch Lightning Introduction <https://lightning.ai/docs/pytorch/stable/starter/introduction.html#step-1-define-lightningmodule>`_ to learn more.
----
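
For orientation, here is a rough, hypothetical sketch of the kind of ``train.py`` the ``literalinclude`` above pulls in: a LightningModule ``AutoEncoder`` trained on MNIST with the PyTorch Lightning ``Trainer``. It is illustrative only (it is not the repository's ``code_samples/convert_pl_to_app/train.py``) and assumes ``torchvision`` is installed for the dataset.

```python
# Illustrative sketch only, not the repository's train.py.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import transforms
from torchvision.datasets import MNIST
import pytorch_lightning as pl


class LitAutoEncoder(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(28 * 28, 64), nn.ReLU(), nn.Linear(64, 3))
        self.decoder = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 28 * 28))

    def training_step(self, batch, batch_idx):
        x, _ = batch
        x = x.view(x.size(0), -1)
        x_hat = self.decoder(self.encoder(x))
        return nn.functional.mse_loss(x_hat, x)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)


if __name__ == "__main__":
    # Download MNIST and run a short training job.
    train_set = MNIST("./data", download=True, transform=transforms.ToTensor())
    pl.Trainer(max_epochs=1).fit(LitAutoEncoder(), DataLoader(train_set, batch_size=64))
```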

@ -2,7 +2,7 @@
'github': 'https://github.com/Lightning-AI/lightning',
'github_issues': 'https://github.com/Lightning-AI/lightning/issues',
'contributing': 'https://github.com/Lightning-AI/lightning/blob/master/.github/CONTRIBUTING.md',
'governance': 'https://pytorch-lightning.readthedocs.io/en/latest/governance.html',
'governance': 'https://lightning.ai/docs/pytorch/latest/governance.html',
'docs': 'https://lightning.ai/docs/fabric/',
'twitter': 'https://twitter.com/LightningAI',
'home': 'https://lightning.ai/docs/fabric/',

@ -262,7 +262,7 @@ epub_exclude_files = ["search.html"]
intersphinx_mapping = {
"python": ("https://docs.python.org/3", None),
"torch": ("https://pytorch.org/docs/stable/", None),
"pytorch_lightning": ("https://pytorch-lightning.readthedocs.io/en/stable/", None),
"pytorch_lightning": ("https://lightning.ai/docs/pytorch/stable/", None),
}
# -- Options for todo extension ----------------------------------------------
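For context on why this entry matters (a sketch, not part of the diff): ``sphinx.ext.intersphinx`` downloads an ``objects.inv`` inventory from each mapped base URL at build time, so the Fabric docs' cross-references to ``pytorch_lightning`` objects only keep resolving if the new base URL serves that inventory.

```python
# conf.py excerpt (sketch): with None as the second tuple element, the
# inventory is fetched from "<base URL>/objects.inv" rather than a local file.
intersphinx_mapping = {
    "python": ("https://docs.python.org/3", None),
    "torch": ("https://pytorch.org/docs/stable/", None),
    "pytorch_lightning": ("https://lightning.ai/docs/pytorch/stable/", None),
}
```

A quick sanity check is ``python -m sphinx.ext.intersphinx https://lightning.ai/docs/pytorch/stable/objects.inv``, which prints the inventory when the URL is reachable.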

@ -5,7 +5,7 @@ Organize Your Code
Any raw PyTorch can be converted to Fabric with zero refactoring required, giving maximum flexibility in how you want to organize your projects.
However, when developing a project in a team or sharing the code publicly, it can be beneficial to conform to a standard format of how core pieces of the code are organized.
This is what the `LightningModule <https://pytorch-lightning.readthedocs.io/en/stable/common/lightning_module.html>`_ was made for!
This is what the `LightningModule <https://lightning.ai/docs/pytorch/stable/common/lightning_module.html>`_ was made for!
Here is how you can neatly separate the research code (model, loss, optimization, etc.) from the "trainer" code (training loop, checkpointing, logging, etc.).
@ -60,7 +60,7 @@ Take these main ingredients and put them in a LightningModule:
...
This is a minimal LightningModule, but there are `many other useful hooks <https://pytorch-lightning.readthedocs.io/en/stable/common/lightning_module.html#hooks>`_ you can use.
This is a minimal LightningModule, but there are `many other useful hooks <https://lightning.ai/docs/pytorch/stable/common/lightning_module.html#hooks>`_ you can use.
----
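
A condensed sketch of the pattern that page describes, assuming the unified ``lightning`` 2.0 imports (illustrative, not copied from the docs page): the research code sits in a ``LightningModule`` while the training loop stays plain Fabric code you control.

```python
import torch
import lightning as L


class LitClassifier(L.LightningModule):
    """Research code: model, loss and optimizer definition."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


# "Trainer" code: a loop you write and keep full control over.
fabric = L.Fabric(accelerator="cpu", devices=1)
model = LitClassifier()
model, optimizer = fabric.setup(model, model.configure_optimizers())

dataset = torch.utils.data.TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,)))
dataloader = fabric.setup_dataloaders(torch.utils.data.DataLoader(dataset, batch_size=8))

for batch_idx, batch in enumerate(dataloader):
    optimizer.zero_grad()
    loss = model.training_step(batch, batch_idx)  # reuse the hook from the LightningModule
    fabric.backward(loss)
    optimizer.step()
```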

@ -1,3 +1,5 @@
.. include:: links.rst
################
Lightning Fabric
################
@ -67,20 +69,20 @@ Why Fabric?
|
|
Fabric differentiates itself from a fully-fledged trainer like Lightning's `Trainer <https://pytorch-lightning.readthedocs.io/en/stable/common/trainer.html>`_ in these key aspects:
Fabric differentiates itself from a fully-fledged trainer like Lightning's `Trainer`_ in these key aspects:
**Fast to implement**
There is no need to restructure your code: Just change a few lines in the PyTorch script and you'll be able to leverage Fabric features.
**Maximum Flexibility**
Write your own training and/or inference logic down to the individual optimizer calls.
You aren't forced to conform to a standardized epoch-based training loop like the one in Lightning `Trainer <https://pytorch-lightning.readthedocs.io/en/stable/common/trainer.html>`_.
You aren't forced to conform to a standardized epoch-based training loop like the one in Lightning `Trainer`_.
You can do flexible iteration based training, meta-learning, cross-validation and other types of optimization algorithms without digging into framework internals.
This also makes it super easy to adopt Fabric in existing PyTorch projects to speed-up and scale your models without the compromise on large refactors.
Just remember: With great power comes a great responsibility.
**Maximum Control**
The Lightning `Trainer <https://pytorch-lightning.readthedocs.io/en/stable/common/trainer.html>`_ has many built-in features to make research simpler with less boilerplate, but debugging it requires some familiarity with the framework internals.
The Lightning `Trainer`_ has many built-in features to make research simpler with less boilerplate, but debugging it requires some familiarity with the framework internals.
In Fabric, everything is opt-in. Think of it as a toolbox: You take out the tools (Fabric functions) you need and leave the other ones behind.
This makes it easier to develop and debug your PyTorch code as you gradually add more features to it.
Fabric provides important tools to remove undesired boilerplate code (distributed, hardware, checkpoints, logging, ...), but leaves the design and orchestration fully up to you.

@ -1,2 +1,3 @@
.. _PyTorchJob: https://www.kubeflow.org/docs/components/training/pytorch/
.. _Kubeflow: https://www.kubeflow.org
.. _Trainer: https://lightning.ai/docs/pytorch/stable/common/

@ -2,16 +2,16 @@
'github': 'https://github.com/Lightning-AI/lightning',
'github_issues': 'https://github.com/Lightning-AI/lightning/issues',
'contributing': 'https://github.com/Lightning-AI/lightning/blob/master/.github/CONTRIBUTING.md',
'governance': 'https://pytorch-lightning.readthedocs.io/en/latest/governance.html',
'governance': 'https://lightning.ai/docs/pytorch/latest/governance.html',
'docs': 'https://lightning.ai/docs/pytorch/latest/',
'twitter': 'https://twitter.com/LightningAI',
'discuss': 'https://www.pytorchlightning.ai/community',
'tutorials': 'https://pytorch-lightning.readthedocs.io/en/latest/#tutorials',
'tutorials': 'https://lightning.ai/docs/pytorch/latest/#tutorials',
'home': 'https://lightning.ai/docs/pytorch/latest/',
'get_started': 'https://pytorch-lightning.readthedocs.io/en/latest/starter/introduction.html',
'get_started': 'https://lightning.ai/docs/pytorch/latest/starter/introduction.html',
'features': 'https://lightning.ai/docs/pytorch/latest/',
'blog': 'https://lightning.ai/pages/blog/',
'resources': 'https://pytorch-lightning.readthedocs.io/en/latest/#community-examples',
'resources': 'https://lightning.ai/docs/pytorch/latest/#community-examples',
'support': 'https://lightning.ai/docs/pytorch/latest/',
'community': 'https://www.pytorchlightning.ai/community',
'forums': 'https://lightning.ai/forums/',

@ -126,7 +126,7 @@ At last, the quantized model can be saved by:
Hands-on Examples
*****************
Based on the `given example code <https://pytorch-lightning.readthedocs.io/en/stable/notebooks/lightning_examples/text-transformers.html>`_, we show how Intel Neural Compressor conduct model quantization on PyTorch Lightning. We first define the basic config of the quantization process.
Based on the `given example code <https://lightning.ai/docs/pytorch/stable/notebooks/lightning_examples/text-transformers.html>`_, we show how Intel Neural Compressor conduct model quantization on PyTorch Lightning. We first define the basic config of the quantization process.
.. code-block:: python

@ -4,7 +4,7 @@ Community Examples
==================
- `Lightning Bolts: Deep Learning components for extending PyTorch Lightning <https://pytorch-lightning.readthedocs.io/en/latest/ecosystem/bolts.html>`_.
- `Lightning Bolts: Deep Learning components for extending PyTorch Lightning <https://lightning.ai/docs/pytorch/latest/ecosystem/bolts.html>`_.
- `Lightning Flash: Your PyTorch AI Factory - Flash enables you to easily configure and run complex AI recipes <https://github.com/Lightning-AI/lightning-flash>`_.
- `Contextual Emotion Detection (DoubleDistilBert) <https://github.com/juliusberner/emotion_transformer>`_
- `Cotatron: Transcription-Guided Speech Encoder <https://github.com/mindslab-ai/cotatron>`_

@ -219,5 +219,5 @@ This is true for both academic and corporate settings where data cleaning and ad
of iterating through ideas.
- Checkout the live examples to get your hands dirty:
- `Introduction to PyTorch Lightning <https://pytorch-lightning.readthedocs.io/en/stable/notebooks/lightning_examples/mnist-hello-world.html>`_
- `Introduction to DataModules <https://pytorch-lightning.readthedocs.io/en/stable/notebooks/lightning_examples/datamodules.html>`_
- `Introduction to PyTorch Lightning <https://lightning.ai/docs/pytorch/stable/notebooks/lightning_examples/mnist-hello-world.html>`_
- `Introduction to DataModules <https://lightning.ai/docs/pytorch/stable/notebooks/lightning_examples/datamodules.html>`_

@ -1,7 +1,7 @@
# Examples
Our most robust examples showing all sorts of implementations
can be found in our sister library [Lightning Bolts](https://pytorch-lightning.readthedocs.io/en/latest/ecosystem/bolts.html).
can be found in our sister library [Lightning Bolts](https://lightning.ai/docs/pytorch/latest/ecosystem/bolts.html).
______________________________________________________________________
@ -13,7 +13,7 @@ ______________________________________________________________________
## Lightning Fabric Examples
We show how to accelerate your PyTorch code with [Lightning Fabric](https://pytorch-lightning.readthedocs.io/en/stable/fabric/fabric.html) with minimal code changes.
We show how to accelerate your PyTorch code with [Lightning Fabric](https://lightning.ai/docs/fabric) with minimal code changes.
You stay in full control of the training loop.
- [MNIST: Vanilla PyTorch vs. Fabric](fabric/image_classifier/README.md)
@ -33,5 +33,5 @@ ______________________________________________________________________
## Domain Examples
This folder contains older examples. You should instead use the examples
in [Lightning Bolts](https://pytorch-lightning.readthedocs.io/en/latest/ecosystem/bolts.html)
in [Lightning Bolts](https://lightning.ai/docs/pytorch/latest/ecosystem/bolts.html)
for advanced use cases.

@ -3,7 +3,7 @@
This is an example of a GAN (Generative Adversarial Network) that learns to generate realistic images of faces.
We show two code versions:
The first one is implemented in raw PyTorch, but isn't easy to scale.
The second one is using [Lightning Fabric](https://pytorch-lightning.readthedocs.io/en/stable/fabric/fabric.html) to accelerate and scale the model.
The second one is using [Lightning Fabric](https://lightning.ai/docs/fabric) to accelerate and scale the model.
Tip: You can easily inspect the difference between the two files with:

@ -2,7 +2,7 @@
Here are two MNIST classifiers implemented in PyTorch.
The first one is implemented in pure PyTorch, but isn't easy to scale.
The second one is using [Lightning Fabric](https://pytorch-lightning.readthedocs.io/en/stable/fabric/fabric.html) to accelerate and scale the model.
The second one is using [Lightning Fabric](https://lightning.ai/docs/fabric) to accelerate and scale the model.
Tip: You can easily inspect the difference between the two files with:
@ -23,7 +23,7 @@ ______________________________________________________________________
#### 2. Image Classifier with Lightning Fabric
This script shows you how to scale the pure PyTorch code to enable GPU and multi-GPU training using [Lightning Fabric](https://pytorch-lightning.readthedocs.io/en/stable/fabric/fabric.html).
This script shows you how to scale the pure PyTorch code to enable GPU and multi-GPU training using [Lightning Fabric](https://lightning.ai/docs/fabric).
```bash
# CPU

@ -25,7 +25,7 @@ and replace ``loss.backward()`` with ``self.backward(loss)``.
Accelerate your training loop by setting the ``--accelerator``, ``--strategy``, ``--devices`` options directly from
the command line. See ``lightning run model --help`` or learn more from the documentation:
https://pytorch-lightning.readthedocs.io/en/stable/fabric/fabric.html.
https://lightning.ai/docs/fabric.
"""
import argparse

@ -1,6 +1,6 @@
## K-Fold Cross Validation
This is an example of performing K-Fold cross validation supported with [Lightning Fabric](https://pytorch-lightning.readthedocs.io/en/stable/fabric/fabric.html). To learn more about cross validation, check out [this article](https://sebastianraschka.com/blog/2016/model-evaluation-selection-part3.html#introduction-to-k-fold-cross-validation).
This is an example of performing K-Fold cross validation supported with [Lightning Fabric](https://lightning.ai/docs/fabric). To learn more about cross validation, check out [this article](https://sebastianraschka.com/blog/2016/model-evaluation-selection-part3.html#introduction-to-k-fold-cross-validation).
We use the MNIST dataset to train a simple CNN model. We create the k-fold cross validation splits using the `ModelSelection.KFold` [class](https://scikit-learn.org/stable/modules/generated/sklearn.model_selection.KFold.html) in the `scikit-learn` library. Ensure that you have the `scikit-learn` library installed;
@ -10,7 +10,7 @@ pip install scikit-learn
#### Run K-Fold Image Classification with Lightning Fabric
This script shows you how to scale the pure PyTorch code to enable GPU and multi-GPU training using [Lightning Fabric](https://pytorch-lightning.readthedocs.io/en/stable/fabric/fabric.html).
This script shows you how to scale the pure PyTorch code to enable GPU and multi-GPU training using [Lightning Fabric](https://lightning.ai/docs/fabric).
```bash
# CPU

@ -8,7 +8,7 @@ If you are new to meta-learning, have a look at this short [introduction video](
We show two code versions:
The first one is implemented in raw PyTorch, but it contains quite a bit of boilerplate code for distributed training.
The second one is using [Lightning Fabric](https://pytorch-lightning.readthedocs.io/en/stable/fabric/fabric.html) to accelerate and scale the model.
The second one is using [Lightning Fabric](https://lightning.ai/docs/fabric) to accelerate and scale the model.
Tip: You can easily inspect the difference between the two files with:

@ -1,6 +1,6 @@
# Proximal Policy Optimization - PPO implementation powered by Lightning Fabric
This is an example of a Reinforcement Learning algorithm called [Proximal Policy Optimization (PPO)](https://arxiv.org/abs/1707.06347) implemented in PyTorch and accelerated by [Lightning Fabric](https://pytorch-lightning.readthedocs.io/en/stable/fabric/fabric.html).
This is an example of a Reinforcement Learning algorithm called [Proximal Policy Optimization (PPO)](https://arxiv.org/abs/1707.06347) implemented in PyTorch and accelerated by [Lightning Fabric](https://lightning.ai/docs/fabric).
The goal of Reinforcement Learning is to train agents to act in their surrounding environment maximizing the cumulative reward received from it. This can be depicted in the following figure:

@ -19,7 +19,7 @@ __author_email__ = "pytorch@lightning.ai"
__license__ = "Apache-2.0"
__copyright__ = f"Copyright (c) 2018-{time.strftime('%Y')}, {__author__}."
__homepage__ = "https://github.com/Lightning-AI/lightning"
__docs_url__ = "https://pytorch-lightning.readthedocs.io/en/stable/"
__docs_url__ = "https://lightning.ai/docs/pytorch/stable/"
# this has to be simple string, see: https://github.com/pypa/twine/issues/522
__docs__ = (
"Use Lightning Apps to build everything from production-ready, multi-cloud ML systems to simple research demos."

@ -625,7 +625,7 @@ class DeepSpeedStrategy(DDPStrategy, _Sharded):
if self.config is None:
raise ValueError(
"To use DeepSpeed you must pass in a DeepSpeed config dict, or a path to a JSON config."
" See: https://pytorch-lightning.readthedocs.io/en/stable/advanced/model_parallel.html#deepspeed"
" See: https://lightning.ai/docs/pytorch/stable/advanced/model_parallel.html#deepspeed"
)
self.config.setdefault("train_micro_batch_size_per_gpu", 1)
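
For readers hitting the error above, a hedged usage sketch (assuming the ``config`` argument of Fabric's ``DeepSpeedStrategy`` and 2.0-style imports; adapt to the version you have installed):

```python
# Sketch: satisfy the check above by passing a DeepSpeed config dict (or a
# path to a JSON file with the same content) when constructing the strategy.
# Requires a CUDA machine with deepspeed installed.
from lightning.fabric import Fabric
from lightning.fabric.strategies import DeepSpeedStrategy

ds_config = {
    "train_micro_batch_size_per_gpu": 1,
    "zero_optimization": {"stage": 2},
}

fabric = Fabric(accelerator="cuda", devices=2, strategy=DeepSpeedStrategy(config=ds_config))
fabric.launch()
```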

@ -594,7 +594,7 @@ class LightningCLI:
f"`{self.__class__.__name__}.add_configure_optimizers_method_to_model` expects at most one optimizer "
f"and one lr_scheduler to be 'AUTOMATIC', but found {optimizers+lr_schedulers}. In this case the user "
"is expected to link the argument groups and implement `configure_optimizers`, see "
"https://pytorch-lightning.readthedocs.io/en/stable/common/lightning_cli.html"
"https://lightning.ai/docs/pytorch/stable/common/lightning_cli.html"
"#optimizers-and-learning-rate-schedulers"
)
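
The error above points users toward implementing ``configure_optimizers`` themselves once more than one optimizer or scheduler is in play. A hedged sketch of what that looks like in the ``LightningModule`` (the class and hyperparameters are made up for illustration):

```python
import torch
import lightning.pytorch as pl


class TwoOptimizerModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.generator = torch.nn.Linear(8, 8)
        self.discriminator = torch.nn.Linear(8, 1)
        # Multiple optimizers require manual optimization in Lightning 2.0.
        self.automatic_optimization = False

    def configure_optimizers(self):
        # Define the optimizers explicitly instead of relying on LightningCLI's
        # single 'AUTOMATIC' optimizer wiring.
        opt_g = torch.optim.Adam(self.generator.parameters(), lr=1e-3)
        opt_d = torch.optim.Adam(self.discriminator.parameters(), lr=1e-4)
        return [opt_g, opt_d]
```

Inside ``training_step`` the optimizers are then fetched with ``self.optimizers()`` and stepped explicitly, with ``self.manual_backward(loss)`` in place of automatic backpropagation.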

@ -592,7 +592,7 @@ class DeepSpeedStrategy(DDPStrategy):
if self.config is None:
raise MisconfigurationException(
"To use DeepSpeed you must pass in a DeepSpeed config dict, or a path to a JSON config."
" See: https://pytorch-lightning.readthedocs.io/en/stable/advanced/model_parallel.html#deepspeed"
" See: https://lightning.ai/docs/pytorch/stable/advanced/model_parallel.html#deepspeed"
)
self._format_batch_size_and_grad_accum_config()
self._format_precision_config()
@ -760,7 +760,7 @@ class DeepSpeedStrategy(DDPStrategy):
"When saving the DeepSpeed Stage 3 checkpoint, "
"each worker will save a shard of the checkpoint within a directory. "
"If a single file is required after training, "
"see https://pytorch-lightning.readthedocs.io/en/stable/advanced/model_parallel.html#"
"see https://lightning.ai/docs/pytorch/stable/advanced/model_parallel.html#"
"deepspeed-zero-stage-3-single-file for instructions."
)
# Use deepspeed's internal checkpointing function to handle partitioned weights across processes
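
As a companion to the warning above, a hedged sketch of consolidating a sharded ZeRO Stage 3 checkpoint into a single file after training (this assumes the ``convert_zero_checkpoint_to_fp32_state_dict`` utility described in the linked docs; verify the import path for your installed version):

```python
# Sketch: after saving with ZeRO Stage 3, the checkpoint path is a directory
# of per-rank shards. Consolidate it into one fp32 checkpoint file.
from pytorch_lightning.utilities.deepspeed import convert_zero_checkpoint_to_fp32_state_dict

convert_zero_checkpoint_to_fp32_state_dict(
    "checkpoints/epoch=9.ckpt",         # sharded checkpoint directory written by DeepSpeed
    "checkpoints/epoch=9_single.ckpt",  # consolidated single-file output
)
```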

@ -95,15 +95,15 @@ Easy communication 🛰️ between components is supported with:
- [Directional state updates](https://lightning.ai/lightning-docs/core_api/lightning_app/communication.html?highlight=directional%20state) from the Works to the Flow creating an event: When creating interactive apps, you will likely want your components to share information with each other. You might to rely on that information to control their execution, share progress in the UI, trigger a sequence of operations, or more.
- [Storage](https://lightning.ai/lightning-docs/api_reference/storage.html): The Lightning Storage system makes it easy to share files between LightningWork so you can run your app both locally and in the cloud without changing the code.
- [Path](https://lightning.ai/lightning-docs/api_reference/generated/lightning_app.storage.path.Path.html#lightning_app.storage.path.Path): The Path object is a reference to a specific file or directory from a LightningWork and can be used to transfer those files to another LightningWork (one way, from source to destination).
- [Payload](https://lightning.ai/lightning-docs/api_reference/generated/lightning_app.storage.payload.Payload.html#lightning_app.storage.payload.Payload): The Payload object enables transferring of Python objects from one work to another in a similar fashion as Path.
- [Drive](https://lightning.ai/lightning-docs/api_reference/generated/lightning_app.storage.drive.Drive.html#lightning_app.storage.drive.Drive): The Drive object provides a central place for your components to share data. The drive acts as an isolated folder and any component can access it by knowing its name.
- [Path](https://lightning.ai/docs/app/stable/api_reference/generated/lightning.app.storage.path.Path.html#lightning_app.storage.path.Path): The Path object is a reference to a specific file or directory from a LightningWork and can be used to transfer those files to another LightningWork (one way, from source to destination).
- [Payload](https://lightning.ai/docs/app/stable/api_reference/generated/lightning.app.storage.payload.Payload.html#lightning_app.storage.payload.Payload): The Payload object enables transferring of Python objects from one work to another in a similar fashion as Path.
- [Drive](https://lightning.ai/docs/app/stable/api_reference/generated/lightning.app.storage.drive.Drive.html#lightning_app.storage.drive.Drive): The Drive object provides a central place for your components to share data. The drive acts as an isolated folder and any component can access it by knowing its name.
Lightning Apps have built-in support for [adding UIs](https://lightning.ai/lightning-docs/workflows/add_web_ui/) 🎨:
- [StaticWebFrontEnd](https://lightning.ai/lightning-docs/api_reference/generated/lightning_app.frontend.web.StaticWebFrontend.html#lightning_app.frontend.web.StaticWebFrontend): A frontend that serves static files from a directory using FastAPI.
- [StreamlitFrontend](https://lightning.ai/lightning-docs/api_reference/generated/lightning_app.frontend.stream_lit.StreamlitFrontend.html#lightning_app.frontend.stream_lit.StreamlitFrontend): A frontend for wrapping Streamlit code in your LightingFlow.
- [ServeGradio](https://lightning.ai/docs/stable/api_reference/generated/lightning_app.components.serve.gradio_server.ServeGradio.html#servegradio): This class enables you to quickly create a `gradio` based UI for your Lightning App.
- [StaticWebFrontEnd](https://lightning.ai/docs/app/stable/api_reference/generated/lightning.app.frontend.web.StaticWebFrontend.html#lightning_app.frontend.web.StaticWebFrontend): A frontend that serves static files from a directory using FastAPI.
- [StreamlitFrontend](https://lightning.ai/docs/app/stable/api_reference/generated/lightning.app.frontend.stream_lit.StreamlitFrontend.html#lightning_app.frontend.stream_lit.StreamlitFrontend): A frontend for wrapping Streamlit code in your LightingFlow.
- [ServeGradio](https://lightning.ai/docs/app/stable/api_reference/generated/lightning.app.components.serve.gradio_server.ServeGradio.html#servegradio): This class enables you to quickly create a `gradio` based UI for your Lightning App.
[Scheduling](https://lightning.ai/lightning-docs/glossary/scheduling.html) ⏲️: The Lightning Scheduling system makes it easy to schedule your components execution with any arbitrary conditions.
@ -113,8 +113,8 @@ Advanced users who need full control over the environment a LightningWork runs i
Ready to use [built-in components](https://lightning.ai/lightning-docs/api_reference/components.html?highlight=built%20components) 🧱:
- [PopenPythonScript](https://lightning.ai/lightning-docs/api_reference/generated/lightning_app.components.python.popen.PopenPythonScript.html#lightning_app.components.python.popen.PopenPythonScript): This class enables you to easily run a Python Script.
- [ModelInferenceAPI](https://lightning.ai/lightning-docs/api_reference/generated/lightning_app.components.serve.serve.ModelInferenceAPI.html#lightning_app.components.serve.serve.ModelInferenceAPI): This class enables you to easily get your model served.
- [PopenPythonScript](https://lightning.ai/docs/app/stable/api_reference/generated/lightning.app.components.python.popen.PopenPythonScript.html#lightning_app.components.python.popen.PopenPythonScript): This class enables you to easily run a Python Script.
- [ModelInferenceAPI](https://lightning.ai/docs/app/stable/api_reference/generated/lightning.app.components.serve.serve.ModelInferenceAPI.html#lightning_app.components.serve.serve.ModelInferenceAPI): This class enables you to easily get your model served.
# App gallery

@ -5,7 +5,7 @@ __author_email__ = "pytorch@lightning.ai"
__license__ = "Apache-2.0"
__copyright__ = f"Copyright (c) 2022-{time.strftime('%Y')}, {__author__}."
__homepage__ = "https://github.com/Lightning-AI/lightning"
__docs_url__ = "https://pytorch-lightning.readthedocs.io/en/stable/"
__docs_url__ = "https://lightning.ai/docs/pytorch/stable/"
__docs__ = ""
__all__ = [

@ -11,7 +11,7 @@ ______________________________________________________________________
<a href="https://www.pytorchlightning.ai/">Website</a>
<a href="#key-features">Key Features</a>
<a href="#how-to-use">How To Use</a>
<a href="https://pytorch-lightning.readthedocs.io/en/stable/">Docs</a>
<a href="https://lightning.ai/docs/pytorch/stable/">Docs</a>
<a href="#examples">Examples</a>
<a href="#community">Community</a>
<a href="https://lightning.ai/">Lightning AI</a>
@ -27,8 +27,7 @@ ______________________________________________________________________
[![DockerHub](https://img.shields.io/docker/pulls/pytorchlightning/pytorch_lightning.svg)](https://hub.docker.com/r/pytorchlightning/pytorch_lightning)
[![codecov](https://codecov.io/gh/Lightning-AI/lightning/branch/master/graph/badge.svg)](https://codecov.io/gh/Lightning-AI/lightning)
[![ReadTheDocs](https://readthedocs.org/projects/pytorch-lightning/badge/?version=stable)](https://pytorch-lightning.readthedocs.io/en/stable/)
[![Discord](https://img.shields.io/discord/1077906959069626439?style=plastic)](https://discord.gg/VptPCZkGNa)
[![ReadTheDocs](https://readthedocs.org/projects/pytorch-lightning/badge/?version=stable)](https://lightning.ai/docs/pytorch/stable/)[![Discord](https://img.shields.io/discord/1077906959069626439?style=plastic)](https://discord.gg/VptPCZkGNa)
[![license](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://github.com/Lightning-AI/lightning/blob/master/LICENSE)
<!--
@ -65,7 +64,7 @@ Lightning forces the following structure to your code which makes it reusable an
Once you do this, you can train on multiple-GPUs, TPUs, CPUs, IPUs, HPUs and even in 16-bit precision without changing your code!
[Get started in just 15 minutes](https://pytorch-lightning.readthedocs.io/en/latest/starter/introduction.html)
[Get started in just 15 minutes](https://lightning.ai/docs/pytorch/latest/starter/introduction.html)
______________________________________________________________________
@ -205,7 +204,7 @@ trainer.fit(autoencoder, DataLoader(train), DataLoader(val))
## Advanced features
Lightning has over [40+ advanced features](https://pytorch-lightning.readthedocs.io/en/latest/common/trainer.html#trainer-flags) designed for professional AI research at scale.
Lightning has over [40+ advanced features](https://lightning.ai/docs/pytorch/latest/common/trainer.html#trainer-flags) designed for professional AI research at scale.
Here are some examples:
@ -371,7 +370,7 @@ ______________________________________________________________________
The PyTorch Lightning community is maintained by
- [10+ core contributors](https://pytorch-lightning.readthedocs.io/en/latest/governance.html) who are all a mix of professional engineers, Research Scientists, and Ph.D. students from top AI labs.
- [10+ core contributors](https://lightning.ai/docs/pytorch/latest/governance.html) who are all a mix of professional engineers, Research Scientists, and Ph.D. students from top AI labs.
- 680+ active community contributors.
Want to help us build Lightning and reduce boilerplate for thousands of researchers? [Learn how to make your first contribution here](https://devblog.pytorchlightning.ai/quick-contribution-guide-86d977171b3a)

@ -18,7 +18,7 @@ __author_email__ = "pytorch@lightning.ai"
__license__ = "Apache-2.0"
__copyright__ = f"Copyright (c) 2018-{time.strftime('%Y')}, {__author__}."
__homepage__ = "https://github.com/Lightning-AI/lightning"
__docs_url__ = "https://pytorch-lightning.readthedocs.io/en/stable/"
__docs_url__ = "https://lightning.ai/docs/pytorch/stable/"
# this has to be simple string, see: https://github.com/pypa/twine/issues/522
__docs__ = (
"PyTorch Lightning is the lightweight PyTorch wrapper for ML researchers."
@ -42,8 +42,8 @@ Overall, Lightning guarantees rigorously tested, correct, modern best practices
Documentation
-------------
- https://pytorch-lightning.readthedocs.io/en/latest
- https://pytorch-lightning.readthedocs.io/en/stable
- https://lightning.ai/docs/pytorch/en/latest
- https://lightning.ai/docs/pytorch/en/stable
"""
__all__ = [