Added advice for num_workers=0 in docs/speed (#10215)

Nesqulck 2021-10-29 10:58:42 +02:00 committed by GitHub
parent 5f4ffdee41
commit ff2d7e8115
1 changed file with 10 additions and 0 deletions

@@ -145,6 +145,16 @@ some references, [`1 <https://discuss.pytorch.org/t/guidelines-for-assigning-num
The best thing to do is to increase ``num_workers`` slowly and stop once there is no more improvement in your training speed.
For debugging purposes, or for dataloaders that load very small datasets, it can be desirable to set ``num_workers=0``. However, a warning will always be logged for every dataloader with ``num_workers <= min(2, os.cpu_count())``. In that case, you can filter out this specific warning:

.. code-block:: python

    import warnings

    warnings.filterwarnings(
        "ignore", ".*does not have many workers. Consider increasing the value of the `num_workers` argument.*"
    )
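To see that a filter like this behaves as intended, you can check it against a warning shaped like the one Lightning emits. The sketch below is illustrative only (the exact warning text is an assumption); it records warnings and confirms that a matching message is suppressed:

```python
import warnings

# Pattern from the docs snippet above; ``filterwarnings`` treats it as a
# regex matched against the start of the warning message.
pattern = (
    ".*does not have many workers. Consider increasing the value"
    " of the `num_workers` argument.*"
)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")          # record everything by default
    warnings.filterwarnings("ignore", pattern)  # ...except matching messages
    # Hypothetical message in the shape of Lightning's dataloader warning:
    warnings.warn(
        "The dataloader, train_dataloader, does not have many workers."
        " Consider increasing the value of the `num_workers` argument."
    )

print(len(caught))  # 0 -> the warning was filtered out
```

Note that the trailing ``.*`` matters: without it, any suffix after "argument" (such as a suggested worker count) would prevent the pattern from matching the full message.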
Spawn
"""""
When using ``strategy="ddp_spawn"`` or training on TPUs, multiple GPUs/TPU cores are used by spawning one process per device via ``.spawn()`` under the hood.
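The fan-out pattern behind this can be sketched with the standard library. This is not Lightning's implementation, and the function names are hypothetical; note also that Lightning uses the ``spawn`` start method (a fresh interpreter per process), while this sketch uses ``fork`` so it stays runnable without a ``__main__`` guard:

```python
import multiprocessing as mp


def _train_worker(rank: int, world_size: int, queue) -> None:
    # In real DDP training, each rank would pin its device, wrap the
    # model, and run the training loop; here each worker just reports back.
    queue.put(f"rank {rank}/{world_size} done")


def spawn_training(world_size: int = 2):
    # "fork" keeps this sketch self-contained; Lightning's ddp_spawn
    # uses the "spawn" start method instead.
    ctx = mp.get_context("fork")
    queue = ctx.Queue()
    procs = [
        ctx.Process(target=_train_worker, args=(rank, world_size, queue))
        for rank in range(world_size)
    ]
    for p in procs:
        p.start()
    results = sorted(queue.get() for _ in procs)
    for p in procs:
        p.join()
    return results


print(spawn_training())  # one entry per spawned rank
```

Because each spawned process starts with fresh state, anything the workers need (model, dataloaders) must be constructed or transferred inside each process, which is why ``ddp_spawn`` carries extra constraints compared to plain ``ddp``.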