7c7e50ca47
* added tpu_id to mixins
* train on individual tpu
* parallel loader if tpu_id is None
* removed progress_bar_refresh_rate
* chlog
* replaced num_tpu_cores with tpu_cores
* set tpu_id to None if int
* changed num_tpu_cores to tpu_cores in docs
* updated docs
* updated __init__.py
* removed self.tpu_id for ParallelLoader
* Update pytorch_lightning/trainer/__init__.py
* check if tpu_cores is a list
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* xla device conditional
* num_tpu_cores deprecation
* removed duplicate warning
* fixed pep8 error
* Revert "removed duplicate warning"
This reverts commit