* tpu device check
* replaced with xmp spawn
* Revert "replaced with xmp spawn"
This reverts commit 6835380f
* replaced all instances of XLA_AVAILABLE
* moved inner_f to global scope
* made refactors
* added changelog
* added TPU_AVAILABLE variable
* fix codefactor issues
* removed from trainer and early stopping
* add TORCHXLA_AVAILABLE check
* added tests
* refactoring
* Update pytorch_lightning/utilities/xla_device_utils.py
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
* updated function names
* fixed bug
* updated CHANGELOG.md
* added todo
* added type hints
* isort and black
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Co-authored-by: William Falcon <waf2107@columbia.edu>
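The commits above add a TPU/XLA availability probe (pytorch_lightning/utilities/xla_device_utils.py) that moves the probe function (`inner_f`) to module scope and exposes `XLA_AVAILABLE` / `TPU_AVAILABLE`. A minimal sketch of that idea follows; apart from those three names, everything here is illustrative rather than the actual implementation:

```python
# Sketch of a TPU availability check: probe the XLA runtime in a child
# process so a hanging or crashing runtime cannot take down the caller.
import importlib.util
import multiprocessing as mp
import queue

XLA_AVAILABLE = importlib.util.find_spec("torch_xla") is not None


def inner_f(q) -> None:
    # runs in a child process; reports whether an XLA device can be acquired
    try:
        import torch_xla.core.xla_model as xm
        q.put(xm.xla_device() is not None)
    except Exception:
        q.put(False)


def tpu_device_exists(timeout: int = 25) -> bool:
    if not XLA_AVAILABLE:
        return False
    q = mp.Queue()
    proc = mp.Process(target=inner_f, args=(q,))
    proc.start()
    proc.join(timeout)
    if proc.is_alive():
        proc.terminate()
    try:
        return q.get_nowait()
    except queue.Empty:
        return False


# evaluated once so callers can simply check the flag
TPU_AVAILABLE = tpu_device_exists()
```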
* make current_epoch and global_step the same as the trainer's after model restore
* remove assignment here
* test
* minor modification
* Update pytorch_lightning/core/lightning.py
type check, better clarity
Co-authored-by: ananthsub <ananth.subramaniam@gmail.com>
* Update pytorch_lightning/core/lightning.py
type check, better clarity
Co-authored-by: ananthsub <ananth.subramaniam@gmail.com>
* comments for current_epoch and global_step properties
* Update tests/models/test_restore.py
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* update comments according to the changes made
* Update tests/models/test_restore.py
* add current_epoch, global_step to jit ignore list
* Add comments to CHANGELOG
* Update CHANGELOG.md
* Update tests/models/test_restore.py
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Co-authored-by: ananthsub <ananth.subramaniam@gmail.com>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
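This group makes `current_epoch` and `global_step` read from the attached trainer, so the values stay consistent with the counters restored from a checkpoint. A rough sketch of the pattern, with illustrative fallback values:

```python
# current_epoch and global_step become properties that mirror the trainer's
# counters instead of being stored on the module itself.
class LightningModuleSketch:
    def __init__(self):
        self.trainer = None  # set by the Trainer when the model is attached

    @property
    def current_epoch(self) -> int:
        """Current epoch of the attached trainer, 0 if no trainer is attached."""
        return self.trainer.current_epoch if self.trainer else 0

    @property
    def global_step(self) -> int:
        """Total optimizer steps taken by the attached trainer, 0 otherwise."""
        return self.trainer.global_step if self.trainer else 0
```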
* refactored callback system and init ddp
* refactored callback system and init ddp
* refactored callback system and init ddp
* refactored callback system and init ddp
* ref: test val epoch end
* ref: test val epoch end
* ref: test val epoch end
* ref: test log dict
* ref: test log dict
* ref: test log dict
* ref: test log dict
* Split out changes from #3563 to make that PR easier to review. This formats the file according to the Black formatter
* Store a reference to the trainer on the datamodule
Fixes #3682
* Update data_connector.py
* Update data_connector.py
* Update test_datamodules.py
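These data_connector commits keep a back-reference to the trainer on the datamodule when it is attached, so datamodule hooks can reach trainer state. A sketch of the wiring; only `trainer` and `datamodule` come from the commits, the rest is illustrative:

```python
# Attach the datamodule and store a trainer reference on it.
class DataConnectorSketch:
    def __init__(self, trainer):
        self.trainer = trainer

    def attach_datamodule(self, model, datamodule=None) -> None:
        if datamodule is None:
            return
        # ... pick up the datamodule's dataloaders here ...
        # keep a back-reference so the datamodule can see the trainer
        datamodule.trainer = self.trainer
        self.trainer.datamodule = datamodule
```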
* Split out changes from #3563 to make that PR easier to review. This formats the file according to the Black formatter
* support checkpoint hooks for datamodule
refactor on_{save/load}_checkpoint to a separate hook class that both the lightning module and data module inherit
add spots in callback connector to call new datamodule hooks if available
* hooks formatting
* Update hooks.py
* Update checkpoint_connector.py
* Update lightning.py
* update based on upstream/master
checkout upstream/master
* Update checkpoint_connector.py
* add tests
* undo format revert
* Updated CHANGELOG.md
* add checkpoint hooks
* add Dict type
* import CheckpointHooks
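The checkpoint-hook commits move `on_save_checkpoint` / `on_load_checkpoint` into a shared hooks class that both the LightningModule and the LightningDataModule inherit, with the connector calling the datamodule hooks when a datamodule is present. A condensed sketch under those assumptions (class and function names other than the hook names are illustrative):

```python
from typing import Any, Dict


class CheckpointHooks:
    def on_save_checkpoint(self, checkpoint: Dict[str, Any]) -> None:
        """Called when a checkpoint is saved; mutate `checkpoint` to add state."""

    def on_load_checkpoint(self, checkpoint: Dict[str, Any]) -> None:
        """Called when a checkpoint is loaded; read extra state back out."""


class LightningModuleSketch(CheckpointHooks):
    pass


class LightningDataModuleSketch(CheckpointHooks):
    pass


def dump_checkpoint(model, datamodule) -> Dict[str, Any]:
    # connector-side sketch: call the hook on the model and, if present,
    # on the datamodule as well
    checkpoint: Dict[str, Any] = {"state_dict": {}}
    model.on_save_checkpoint(checkpoint)
    if datamodule is not None:
        datamodule.on_save_checkpoint(checkpoint)
    return checkpoint
```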
* ref: test val epoch end
* ref: test val epoch end
* ref: test val epoch end
* ref: test val epoch end
* ref: test val epoch end
* ref: test val epoch end
* enable pt 1.7
* readme
* nightly diff version testing, will delete later
* nightly diff version testing, will delete later
* back to normal [ci skip]
* use __ignored_properties__
* define __ignored_properties__ in respective modules
* change log
* formatting
Co-authored-by: Jirka Borovec <jirka@pytorchlightning.ai>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
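Because trainer-backed properties like `current_epoch` and `global_step` cannot be compiled, these commits keep them on an ignore list (`__ignored_properties__` in the log) so TorchScript export skips them. The sketch below assumes the list feeds the standard TorchScript mechanism, the `__jit_unused_properties__` class attribute; the module itself is illustrative:

```python
import torch
from torch import nn


class ModelSketch(nn.Module):
    # properties TorchScript should not try to compile
    __jit_unused_properties__ = ["current_epoch", "global_step"]

    def __init__(self):
        super().__init__()
        self.trainer = None
        self.layer = nn.Linear(4, 2)

    @property
    def current_epoch(self) -> int:
        return self.trainer.current_epoch if self.trainer else 0

    @property
    def global_step(self) -> int:
        return self.trainer.global_step if self.trainer else 0

    def forward(self, x):
        return self.layer(x)


# the listed properties are left out of the scripted module
scripted = torch.jit.script(ModelSketch())
```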
* Split out changes from #3563 to make that PR easier to review. This formats the file according to the Black formatter
* support checkpoint hooks for datamodule
refactor on_{save/load}_checkpoint to a separate hook class that both the lightning module and data module inherit
add spots in callback connector to call new datamodule hooks if available
* hooks formatting
* Update hooks.py
* Update hooks.py
* Update checkpoint_connector.py
* Update lightning.py
* update based on upstream/master
checkout upstream/master
* Update lightning.py
* Update lightning.py
Co-authored-by: William Falcon <waf2107@columbia.edu>
* Black format pytorch_lightning/core/hooks.py
Split out changes from #3563 to make that PR easier to review. This formats the file according to the Black formatter
* Split out changes from #3563 to make that PR easier to review. This formats the file according to the Black formatter
* script
* docs
* simple test
* move test
* fix doctest
* no grad context
* extend tests
test
test
* datamodule test
* clean up test
* docs
* name
* fix import
* update changelog
* fix import
* skip pytorch 1.3 in test
* update codeblock
* skip bugged 1.4
* typehints
* doctest not working on all pytorch versions
* rename TestGAN to prevent pytest interference
* add note about pytorch version
* fix torchscript version inconsistency in tests
* reset training state + tests
* update docstring
* Apply suggestions from code review
Co-authored-by: Justus Schock <12886177+justusschock@users.noreply.github.com>
* update docstring, dict return
* add docs to index
* add link
* doc eval mode
* forward
* optional save to file path
* optional
* test torchscript device
* test save load with file path
* pep
* str
* Commit typing suggestion
Co-authored-by: ananthsub <ananth.subramaniam@gmail.com>
* skip test if cuda not available
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: Justus Schock <12886177+justusschock@users.noreply.github.com>
Co-authored-by: ananthsub <ananth.subramaniam@gmail.com>
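The torchscript commits add an export helper that scripts the module in eval mode under no_grad, optionally saves to a file path, and restores the previous training mode. A simplified, free-standing sketch (the exact `to_torchscript` signature in pytorch_lightning may differ):

```python
from typing import Optional

import torch
from torch import nn


def to_torchscript(module: nn.Module, file_path: Optional[str] = None) -> torch.jit.ScriptModule:
    mode = module.training          # remember the current training mode
    module.eval()
    with torch.no_grad():           # no autograd state baked into the export
        scripted = torch.jit.script(module)
    module.train(mode)              # restore whatever mode the module was in
    if file_path is not None:
        torch.jit.save(scripted, file_path)
    return scripted
```

Usage would look like `scripted = to_torchscript(model, "model.pt")` followed by `torch.jit.load("model.pt")` on the consuming side.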
* add ddp script variations
* add ddp test
* rename
* shell
* test
* test
* try call
* try without subprocess
* test
* display the error
* list all variations
* try string
* try copy env
* debug
* pythonpath
* path
* update test
* change
* simple ddp test
* replace
* remove random port
* random port
* str
* clean up
* check run spawn
* clean up
* docs
* docs
* update test
* docs
* changelog
* changelog
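The DDP test commits iterate on launching the training script as a real subprocess with a copied environment, an explicit PYTHONPATH, and a free master port. An illustrative version of that launcher (the script path and assertion style are placeholders):

```python
import os
import socket
import subprocess
import sys


def find_free_port() -> str:
    # bind to port 0 and let the OS pick an unused port
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("", 0))
        return str(s.getsockname()[1])


def run_ddp_script(script: str, args: list) -> None:
    env = os.environ.copy()
    env["PYTHONPATH"] = env.get("PYTHONPATH", "") + os.pathsep + os.getcwd()
    env["MASTER_PORT"] = find_free_port()
    result = subprocess.run([sys.executable, script, *args],
                            env=env, capture_output=True, text=True)
    # surface stderr so a failing run is debuggable from the test output
    assert result.returncode == 0, result.stderr
```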
* Override the default gather method to support scalars
* add computing average of a list
* bug: change if to elif
* add some tests
* change style
* change documentation
* use apply_to_collection in DP gather
* use apply_to_collection in DP gather
* fix warning msg
* override gather method in DP
* add tests for python scalars
* add python scalars to docstring
* Update message
* override gather method in DP
* formatting
* chlog
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: Jirka Borovec <jirka@pytorchlightning.ai>
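The gather commits override DataParallel's `gather` so plain Python scalars returned from a step are converted to tensors before being gathered across devices; pytorch_lightning does the traversal with its `apply_to_collection` utility. The self-contained sketch below uses a minimal recursion as a stand-in for that helper:

```python
from numbers import Number

import torch
from torch.nn.parallel import DataParallel


def _scalars_to_tensors(data, device):
    # convert Python ints/floats to tensors; recurse into dicts/lists/tuples
    if isinstance(data, Number) and not isinstance(data, bool):
        return torch.tensor(data, device=device)
    if isinstance(data, dict):
        return {k: _scalars_to_tensors(v, device) for k, v in data.items()}
    if isinstance(data, (list, tuple)):
        return type(data)(_scalars_to_tensors(v, device) for v in data)
    return data


class ScalarAwareDataParallel(DataParallel):
    def gather(self, outputs, output_device):
        outputs = [_scalars_to_tensors(o, output_device) for o in outputs]
        return super().gather(outputs, output_device)
```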
* export model to onnx
* prepare data before exporting
* support for dataloaders and tensors
* added tests
* use example_input_array
add to changelog
* updated docstring
* added onnx inference tests
* temp commit
* removed schema valid test
* add onnxruntime to environment.yml
* moved onnxruntime to environment.yml pip
* add example in doc
* add lines between code block
* added PR to changelog
* is file check
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* remove *
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
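The ONNX export added here is driven by `example_input_array` when no input sample is passed, switches to eval mode for the export, and writes to the given file path. A condensed sketch of the flow; the signature is illustrative, see `LightningModule.to_onnx` for the real API:

```python
from typing import Any, Optional

import torch
from torch import nn


def to_onnx(module: nn.Module, file_path: str,
            input_sample: Optional[Any] = None, **kwargs) -> None:
    # fall back to the module's example_input_array if no sample is given
    if input_sample is None:
        input_sample = getattr(module, "example_input_array", None)
    if input_sample is None:
        raise ValueError("Provide an input_sample or set model.example_input_array")
    mode = module.training
    module.eval()                                   # export in eval mode
    torch.onnx.export(module, input_sample, file_path, **kwargs)
    module.train(mode)                              # restore previous mode
```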
* infer example outputs
* added doctest for onnx
* fix windows tests
* moved eval within condition block
* self.forward to self
* added docs
* fixed docs error
* added to toctree
* Update CHANGELOG.md
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
* truncate version number
* add docs and example
* extend docs
* docs
* docs
* changelog
* show last
* Update pytorch_lightning/core/lightning.py
* Update pytorch_lightning/core/lightning.py
Co-authored-by: William Falcon <waf2107@columbia.edu>
* recursive dtype device apply
* simplify
* simple test
* submodule test
* rename
* explicit
* type hints
* test for dp backend
* fix test skip
* rename
* add ddp_spawn test
* fix None index in test
* try fix ddp_spawn test
* changelog
* move _dtype and _device to mixin
* additional doctest
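The final group moves the `_dtype` and `_device` bookkeeping into a mixin and applies dtype/device changes recursively to submodules. A sketch of that mixin, simplified to explicit `device`/`dtype` keyword arguments; class and attribute names beyond `_dtype`/`_device` are illustrative:

```python
import torch
from torch import nn


class DeviceDtypeMixinSketch(nn.Module):
    def __init__(self):
        super().__init__()
        self._device = torch.device("cpu")
        self._dtype = torch.get_default_dtype()

    @property
    def device(self) -> torch.device:
        return self._device

    @property
    def dtype(self) -> torch.dtype:
        return self._dtype

    def to(self, device=None, dtype=None):
        def update(module):
            # keep every submodule that carries the mixin in sync
            if isinstance(module, DeviceDtypeMixinSketch):
                if device is not None:
                    module._device = torch.device(device)
                if dtype is not None:
                    module._dtype = dtype

        self.apply(update)  # nn.Module.apply recurses into all submodules
        return super().to(device=device, dtype=dtype)
```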