* xfail if not installed
include monkeypatch
fix test
* mock comet
comet mocks
fix test
remove dep
undo merge duplication
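The comet commits above trade a hard comet_ml dependency for xfail markers and mocks. A minimal sketch of that pattern, assuming the logger module lives at `pytorch_lightning.loggers.comet` (the patch target and test bodies are illustrative):

```python
from unittest import mock

import pytest

try:
    import comet_ml  # noqa: F401
    _COMET_AVAILABLE = True
except ImportError:
    _COMET_AVAILABLE = False


# xfail (rather than hard-fail) when the optional dependency is missing.
@pytest.mark.xfail(not _COMET_AVAILABLE, reason="comet_ml is not installed")
def test_comet_real():
    experiment = comet_ml.Experiment(api_key="fake-key", disabled=True)
    assert experiment is not None


# Or patch the module out entirely so the test never needs the dependency.
@mock.patch("pytorch_lightning.loggers.comet.comet_ml")
def test_comet_mocked(mocked_comet_ml):
    from pytorch_lightning.loggers import CometLogger

    logger = CometLogger(api_key="fake-key")
    logger.log_metrics({"loss": 0.5})
```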
* line
* convert doctest
* doctest
* docs
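"Convert doctest" turns a static snippet in the docs into one that actually executes. A generic illustration of the target format; the function itself is hypothetical:

```python
def add(a: int, b: int) -> int:
    """Return the sum of ``a`` and ``b``.

    Example::

        >>> add(1, 2)
        3
        >>> add(-1, 1)
        0
    """
    return a + b
```

Running the suite with `pytest --doctest-modules` (or Sphinx's doctest builder) executes the example, so the docs cannot silently drift from the code.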
* Added test for logging in the validation step when using a dict dataset with a string value
* fix recursive issue
Co-authored-by: Nathan Painchaud <nathanpainchaud@gmail.com>
Co-authored-by: William Falcon <waf2107@columbia.edu>
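A sketch of the scenario that test covers: a dataset whose samples are dicts mixing tensors with plain strings, where logging in `validation_step` must not trip over the non-tensor entries. All names here are illustrative:

```python
import torch
from torch.utils.data import DataLoader, Dataset

import pytorch_lightning as pl


class DictDataset(Dataset):
    """Each sample is a dict that mixes a tensor with a plain string value."""

    def __len__(self):
        return 16

    def __getitem__(self, idx):
        return {"x": torch.rand(4), "id": f"sample-{idx}"}


class DictModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 1)

    def validation_step(self, batch, batch_idx):
        # After collation, batch["id"] is a list of strings; logging the
        # tensor metric must still work alongside the non-tensor entry.
        loss = self.layer(batch["x"]).mean()
        self.log("val_loss", loss)

    def val_dataloader(self):
        return DataLoader(DictDataset(), batch_size=4)
```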
* tpu device check
* replaced with xmp spawn
* Revert "replaced with xmp spawn"
This reverts commit 6835380f
* replaced all instances of XLA_AVAILABLE
* moved inner_f to global scope
* made refactors
* added changelog
* added TPU_AVAILABLE variable
* fix codefactor issues
* removed from trainer and early stopping
* add TORCHXLA_AVAILABLE check
* added tests
* refactoring
* Update pytorch_lightning/utilities/xla_device_utils.py
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
* updated function names
* fixed bug
* updated CHANGELOG.md
* added todo
* added type hints
* isort and black
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Co-authored-by: William Falcon <waf2107@columbia.edu>
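These xla_device_utils commits replace a bare "is torch_xla importable" check with a real device probe, run in a child process so a hung probe cannot block startup (hence `inner_f` at global scope, so it can be pickled). A sketch of that shape; the timeout and exact structure are assumptions:

```python
import importlib.util
import traceback
from multiprocessing import Process, Queue
from queue import Empty

# An import check only proves torch_xla is installed, not that a TPU responds.
XLA_AVAILABLE = importlib.util.find_spec("torch_xla") is not None


def inner_f(queue: Queue) -> None:
    """Child-process body: try to actually acquire an XLA device."""
    try:
        import torch_xla.core.xla_model as xm

        queue.put(xm.xla_device() is not None)
    except Exception:
        traceback.print_exc()
        queue.put(False)


def tpu_device_exists() -> bool:
    """Probe for a TPU in a subprocess so the caller can never hang on it."""
    if not XLA_AVAILABLE:
        return False
    queue: Queue = Queue()
    proc = Process(target=inner_f, args=(queue,))
    proc.start()
    proc.join(20)  # the timeout value is an arbitrary choice for this sketch
    if proc.is_alive():
        proc.terminate()
    try:
        return queue.get_nowait()
    except Empty:
        return False


TPU_AVAILABLE = tpu_device_exists()
```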
* Makes sure logging never happens from a non-rank-zero process
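Restricting output to the global rank-zero process is usually done with a small decorator in the spirit of Lightning's `rank_zero_only`; reading the rank from the `RANK` environment variable here is a simplification:

```python
import logging
import os
from functools import wraps

log = logging.getLogger(__name__)


def rank_zero_only(fn):
    """Run ``fn`` only on the process whose global rank is 0."""

    @wraps(fn)
    def wrapped(*args, **kwargs):
        if int(os.environ.get("RANK", 0)) == 0:
            return fn(*args, **kwargs)

    return wrapped


@rank_zero_only
def rank_zero_info(*args, **kwargs):
    # On every non-zero rank this is a no-op, so logs appear exactly once.
    log.info(*args, **kwargs)
```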
* added bug report model
* fix local model
* Fix.
* Fix #2550: allow loading a model from a checkpoint when self.save_hyperparameters() was not called.
* Fix? Cleaner way of not calling self.save_hyperparameters in EvalModelTemplate.
* Fix? `_load_model_state` cleanup
* Fixed side effect in `test_load_model_from_checkpoint_extra_args`.
* Apply suggestions from code review
* fix
* try
* fixed missing arg in evalmodel
* fix
* update
* fix loading
* add test
* prune
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: Jirka Borovec <jirka@pytorchlightning.ai>
Co-authored-by: William Falcon <waf2107@columbia.edu>
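The #2550 fix concerns `load_from_checkpoint` when the checkpoint carries no hyperparameters because `self.save_hyperparameters()` was never called; the constructor arguments then have to be supplied at load time. A usage sketch (the model and checkpoint path are illustrative):

```python
import torch
import pytorch_lightning as pl


class NoHparamsModel(pl.LightningModule):
    def __init__(self, hidden_dim: int = 32):
        super().__init__()
        # self.save_hyperparameters() is intentionally NOT called, so
        # `hidden_dim` is not stored inside the checkpoint.
        self.layer = torch.nn.Linear(hidden_dim, 2)


# With no saved hyperparameters, init args are passed explicitly at load
# time; after the fix this succeeds instead of failing on missing hparams.
model = NoHparamsModel.load_from_checkpoint("example.ckpt", hidden_dim=32)
```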
* make current_epoch and global_step the same as the trainer's after model restore
* remove assignment here
* test
* minor modification
* Update pytorch_lightning/core/lightning.py
type check, better clarity
Co-authored-by: ananthsub <ananth.subramaniam@gmail.com>
* comments for current_epoch and global_step properties
* Update tests/models/test_restore.py
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* update comments according to the changes made
* Update tests/models/test_restore.py
* add current_epoch, global_step to jit ignore list
* Add comments to CHANGELOG
* Update CHANGELOG.md
* Update tests/models/test_restore.py
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Co-authored-by: ananthsub <ananth.subramaniam@gmail.com>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
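The change described above makes `current_epoch` and `global_step` read-through properties that defer to the attached trainer, so they stay consistent after a restore (and both names go on the TorchScript ignore list, since the trainer is not scriptable). A simplified sketch of that shape:

```python
class LightningModuleSketch:
    def __init__(self):
        self.trainer = None  # set by the Trainer when training starts

    @property
    def current_epoch(self) -> int:
        """Mirror the trainer's epoch; 0 when no trainer is attached."""
        return self.trainer.current_epoch if self.trainer else 0

    @property
    def global_step(self) -> int:
        """Mirror the trainer's step count; 0 when no trainer is attached."""
        return self.trainer.global_step if self.trainer else 0
```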
* test selecting the correct backend; temp backends while SLURM and TorchElastic are decoupled
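A sketch of the kind of backend-selection test this describes, asserting which distributed backend the Trainer picks for a given configuration; the patched CUDA calls and the attribute being asserted are assumptions:

```python
import pytest
from pytorch_lightning import Trainer


@pytest.mark.parametrize(
    "trainer_kwargs, expected",
    [
        (dict(distributed_backend="ddp_cpu", num_processes=2), "ddp_cpu"),
        (dict(distributed_backend="ddp_spawn", gpus=2), "ddp_spawn"),
    ],
)
def test_backend_selection(trainer_kwargs, expected, monkeypatch):
    # Pretend GPUs exist so the selection logic runs on any machine.
    monkeypatch.setattr("torch.cuda.is_available", lambda: True)
    monkeypatch.setattr("torch.cuda.device_count", lambda: 2)
    trainer = Trainer(**trainer_kwargs)
    assert trainer.distributed_backend == expected
```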