fix tpu docs (#886)
parent 3562aa5aae
commit 9571de8757
@@ -147,6 +147,7 @@ To train on more than 8 cores, your code actually doesn't change!
|
||||||
All you need to do is submit the following command:
|
All you need to do is submit the following command:
|
||||||
|
|
||||||
.. code-block:: bash
|
.. code-block:: bash
|
||||||
|
|
||||||
$ python -m torch_xla.distributed.xla_dist
|
$ python -m torch_xla.distributed.xla_dist
|
||||||
--tpu=$TPU_POD_NAME
|
--tpu=$TPU_POD_NAME
|
||||||
--conda-env=torch-xla-nightly
|
--conda-env=torch-xla-nightly
|
||||||
|
|
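The hunk above shows only the unchanged context around the edit. For orientation, a fuller invocation might look like the sketch below; the ``--env`` flag, the ``--`` separator, and the trainer script name ``your_trainer_file.py`` are assumptions based on common ``torch_xla.distributed.xla_dist`` usage and are not part of this diff.

.. code-block:: bash

    # Launch the same trainer script on every worker of the TPU pod;
    # everything after "--" is the command each worker runs.
    $ python -m torch_xla.distributed.xla_dist
    --tpu=$TPU_POD_NAME
    --conda-env=torch-xla-nightly
    --env=XLA_USE_BF16=1
    -- python your_trainer_file.py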