################
Fabric Utilities
################
seed_everything
===============
This function sets the random seed in important libraries.
In a single line of code, you can seed PyTorch, NumPy, and Python:
.. code-block:: diff

    + from lightning.fabric import seed_everything

      seed = 42
    - random.seed(seed)
    - numpy.random.seed(seed)
    - torch.manual_seed(seed)
    - torch.cuda.manual_seed(seed)
    + seed_everything(seed)
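For instance, a quick sanity check (just a sketch, not part of the Fabric API) shows that re-seeding reproduces the same random draws:

.. code-block:: python

    import torch

    from lightning.fabric import seed_everything

    seed_everything(42)
    first = torch.rand(3)

    seed_everything(42)
    second = torch.rand(3)

    # both draws are identical because the global seed was reset in between
    assert torch.equal(first, second)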
The same is also available as a method on the Fabric object if you don't want to import it separately:
.. code-block:: python

    from lightning.fabric import Fabric

    fabric = Fabric()
    fabric.seed_everything(42)
In distributed settings, you may need to set a different seed per process, depending on the application, for example when generating noise or data augmentations. This is very straightforward:
.. code-block:: python

    fabric = Fabric(...)
    fabric.seed_everything(seed + fabric.global_rank)
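As a rough sketch of the noise / augmentation use case above (the seed offset and tensor shape are arbitrary), each process then draws its own random values:

.. code-block:: python

    import torch

    from lightning.fabric import Fabric

    fabric = Fabric()  # ranks only differ once the script runs in multiple processes

    # every process derives its own seed from the base seed
    fabric.seed_everything(42 + fabric.global_rank)

    # e.g., augmentation noise now differs from process to process
    noise = torch.randn(4)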
By default, :meth:`~lightning.fabric.fabric.Fabric.seed_everything` also handles the initialization of the seed in :class:`~torch.utils.data.DataLoader` worker processes:
.. code-block:: python

    fabric = Fabric(...)

    # By default, we handle DataLoader workers too:
    fabric.seed_everything(..., workers=True)

    # Can be turned off:
    fabric.seed_everything(..., workers=False)
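Worker seeding matters whenever randomness is drawn inside the worker processes, for example in a dataset's ``__getitem__``. A minimal sketch (the dataset below is hypothetical, and it assumes the DataLoader is set up through Fabric so the derived worker seeds are applied):

.. code-block:: python

    import numpy as np
    import torch
    from torch.utils.data import DataLoader, Dataset

    from lightning.fabric import Fabric


    class AugmentedDataset(Dataset):
        # hypothetical dataset that draws a random augmentation per sample
        def __len__(self):
            return 16

        def __getitem__(self, index):
            noise = np.random.rand()  # executed inside the worker process
            return torch.tensor(index + noise)


    fabric = Fabric()
    fabric.seed_everything(42, workers=True)

    dataloader = DataLoader(AugmentedDataset(), num_workers=2)
    dataloader = fabric.setup_dataloaders(dataloader)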
----
print
=====
Avoid duplicated print statements in the logs during distributed training by using Fabric's :meth:`~lightning.fabric.fabric.Fabric.print` method:
.. code-block:: python

    print("This message gets printed in every process. That's a bit messy!")

    fabric = Fabric(...)
    fabric.print("This message gets printed only in the main process. Much cleaner!")
----
is_wrapped
==========
You can check whether an object (:class:`~torch.nn.Module`, :class:`~torch.optim.Optimizer`, :class:`~torch.utils.data.DataLoader`) has already been set up by Fabric:
.. code-block:: python

    from lightning.fabric import is_wrapped

    if not is_wrapped(model):
        model = fabric.setup(model)

    if not is_wrapped(train_dataloader):
        train_dataloader = fabric.setup_dataloaders(train_dataloader)
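For example, a small helper (hypothetical, just a sketch) that is safe to call more than once because it only sets up objects Fabric has not wrapped yet:

.. code-block:: python

    from lightning.fabric import Fabric, is_wrapped

    fabric = Fabric()


    def ensure_setup(model, dataloader):
        # hypothetical helper: skip objects that Fabric already wrapped,
        # so repeated calls do not wrap them a second time
        if not is_wrapped(model):
            model = fabric.setup(model)
        if not is_wrapped(dataloader):
            dataloader = fabric.setup_dataloaders(dataloader)
        return model, dataloader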