Nicki Skafte
e0b856c105
[Metrics] Confusion matrix class interface ( #4348 )
...
* docs + precision + recall + f_beta + refactor
Co-authored-by: Teddy Koker <teddy.koker@gmail.com>
* rebase
Co-authored-by: Teddy Koker <teddy.koker@gmail.com>
* fixes
Co-authored-by: Teddy Koker <teddy.koker@gmail.com>
* added missing file
* docs
* docs
* extra import
* add confusion matrix
* add to docs
* add test
* pep8 + isort
* update tests
* move util function
* unify functional and class
* add to init
* remove old implementation
* update tests
* pep8
* add duplicate
* fix doctest
* Update pytorch_lightning/metrics/classification/confusion_matrix.py
Co-authored-by: Justus Schock <12886177+justusschock@users.noreply.github.com>
* changelog
* bullet point args
* bullet docs
* bullet docs
Co-authored-by: ananyahjha93 <ananya@pytorchlightning.ai>
Co-authored-by: Teddy Koker <teddy.koker@gmail.com>
Co-authored-by: Justus Schock <12886177+justusschock@users.noreply.github.com>
Co-authored-by: chaton <thomas@grid.ai>
Co-authored-by: Roger Shieh <55400948+s-rog@users.noreply.github.com>
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
2020-10-30 11:44:25 +01:00
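A minimal sketch of the class-based confusion matrix interface added in #4348 above, assuming the class is exported as pytorch_lightning.metrics.ConfusionMatrix and follows the usual update()/compute() metric API with a num_classes argument:
    import torch
    from pytorch_lightning.metrics import ConfusionMatrix

    confmat = ConfusionMatrix(num_classes=3)   # num_classes is an assumed argument
    preds = torch.tensor([0, 1, 2, 2])
    target = torch.tensor([0, 1, 1, 2])
    confmat.update(preds, target)              # accumulate statistics across batches
    print(confmat.compute())                   # 3x3 confusion matrix tensor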
Adrian Wälchli
d1234c592d
deprecate passing ModelCheckpoint instance to Trainer(checkpoint_callback=...) ( #4336 )
...
* first attempt
* update tests
* support multiple
* test bugfix
* changelog
* pep
* pep
* import order
* import
* improve test for resuming
* test
* update test
* add references test
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
* docstring suggestion deprecation
Co-authored-by: Jeff Yang <ydcjeff@outlook.com>
* paramref
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Co-authored-by: Jeff Yang <ydcjeff@outlook.com>
2020-10-30 04:47:37 +01:00
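A sketch of the migration the deprecation in #4336 implies: the ModelCheckpoint instance moves into the callbacks list (which this change teaches to accept checkpoint callbacks, including multiple), while checkpoint_callback stays a bool toggle. The monitor argument is illustrative only:
    from pytorch_lightning import Trainer
    from pytorch_lightning.callbacks import ModelCheckpoint

    checkpoint_cb = ModelCheckpoint(monitor="val_loss")

    # Deprecated after this change (still works, with a warning):
    # trainer = Trainer(checkpoint_callback=checkpoint_cb)

    # Preferred: pass the instance through `callbacks`; `checkpoint_callback`
    # remains a bool that enables/disables default checkpointing.
    trainer = Trainer(callbacks=[checkpoint_cb], checkpoint_callback=True)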
Martin Hwang
b459fd26ac
fix: set `nb` to the total number of devices when `nb` is -1. ( #4209 )
...
* fix: set `nb` to the total number of devices when `nb` is -1.
Refs: #4207
* feat: add test code
1. test the combination of the `auto_select_gpus` and `gpus` options using Trainer
2. test the `pick_multiple_gpus` function directly
Refs: #4207
* docs: modify contents in `Select GPU devices`
Refs: #4207
* refactor: reflect the result of review
Refs: #4207
* refactor: reflect the result of review
Refs: #4207
* Update CHANGELOG.md
Co-authored-by: chaton <thomas@grid.ai>
Co-authored-by: Roger Shieh <55400948+s-rog@users.noreply.github.com>
Co-authored-by: Nicki Skafte <skaftenicki@gmail.com>
2020-10-29 10:50:37 +01:00
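In Trainer terms, the #4209 fix above means gpus=-1 now resolves to the total number of visible devices before auto_select_gpus picks them; a sketch (needs a machine with visible GPUs):
    from pytorch_lightning import Trainer

    # gpus=-1 resolves to all visible GPUs; auto_select_gpus then picks that
    # many unoccupied devices instead of failing on the -1 placeholder.
    trainer = Trainer(gpus=-1, auto_select_gpus=True)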
Boris Dayma
ff41d80706
feat(wandb): log in sync with Trainer step ( #4405 )
...
* feat(wandb): log in sync with Trainer step
* docs: update CHANGELOG
* style(test_wandb): fix formatting
* parentheses
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
2020-10-29 01:07:06 +05:30
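The #4405 change above concerns how the wandb integration reports steps; a sketch, assuming the standard WandbLogger constructor (project and offline are illustrative arguments):
    from pytorch_lightning import Trainer
    from pytorch_lightning.loggers import WandbLogger

    # Metrics sent through the logger are now reported against the Trainer's
    # global step rather than wandb's internal step counter.
    trainer = Trainer(logger=WandbLogger(project="demo", offline=True))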
Jeremy Jordan
1e1a42260a
add option to log momentum ( #4384 )
...
* add option to log momentum
* add docstring
* refactor for cleanliness
Co-authored-by: Nicki Skafte <skaftenicki@gmail.com>
2020-10-28 21:56:58 +05:30
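The momentum option from #4384 above presumably lands on the learning-rate monitoring callback; a hedged sketch assuming it is LearningRateMonitor with a log_momentum flag:
    from pytorch_lightning import Trainer
    from pytorch_lightning.callbacks import LearningRateMonitor

    # log_momentum additionally records the optimizers' momentum / beta terms
    # next to the learning rate.
    lr_monitor = LearningRateMonitor(logging_interval="step", log_momentum=True)
    trainer = Trainer(callbacks=[lr_monitor])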
Rohit Gupta
b26c71eadf
Add optimizer hooks in callbacks ( #4379 )
...
* Add optimizer hooks in callbacks
* optimizer param
* update test
Co-authored-by: Nicki Skafte <skaftenicki@gmail.com>
2020-10-28 13:15:22 +01:00
Carlos Mocholí
00cc69aed7
Add "monitor" to saved ModelCheckpoints ( #4383 )
...
* Add key
* Remove unused variables
* Update CHANGELOG [skip ci]
* best_model_monitor -> monitor
Co-authored-by: Sean Naren <sean.narenthiran@gmail.com>
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
2020-10-28 15:21:08 +05:30
Alexander
4106e2f112
Fix COMET_EXPERIMENT_KEY environment variable usage in comet logger ( #4230 )
...
* Fix COMET_EXPERIMENT_KEY environment variable usage
* Remove unused arg
* Update comet.py
* Add test by Lothiraldan
* remove blank
Co-authored-by: chaton <thomas@grid.ai>
Co-authored-by: Nicki Skafte <skaftenicki@gmail.com>
Co-authored-by: Sean Naren <sean.narenthiran@gmail.com>
2020-10-27 14:30:56 +00:00
ananthsub
6878f3bf4e
Enable DDP Plugin to pass through args to LightningDistributedDataParallel ( #4382 )
...
* Update ddp_plugin.py
* Update ddp_plugin.py
* Update ddp_plugin.py
* Update test_ddp_plugin.py
* Update pytorch_lightning/plugins/ddp_plugin.py
* Update pytorch_lightning/plugins/ddp_plugin.py
* Fixed imports, make ddp_kwargs protected
Co-authored-by: SeanNaren <sean.narenthiran@gmail.com>
2020-10-27 12:27:59 +00:00
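A sketch of the pass-through enabled by #4382 above: keyword arguments given to DDPPlugin are forwarded to LightningDistributedDataParallel. It leans on the plugins Trainer argument from "enable ddp as a plugin" ( #4285 ) further down; find_unused_parameters is just an example torch DDP kwarg, and ddp_cpu/num_processes are used only so the sketch runs without GPUs:
    from pytorch_lightning import Trainer
    from pytorch_lightning.plugins.ddp_plugin import DDPPlugin

    # Extra kwargs on the plugin are handed through to the underlying
    # torch DistributedDataParallel wrapper.
    trainer = Trainer(
        accelerator="ddp_cpu",
        num_processes=2,
        plugins=[DDPPlugin(find_unused_parameters=False)],
    )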
Dusan Drevicky
c50c225f05
feature: Allow str arguments in Trainer.profiler ( #3656 )
...
* allow trainer's profiler param to have a str value
* add tests
* update docs
* update exception message
* Update CHANGELOG
* fix pep8 issues
* cleanup test code
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
* Add deprecation warning if using bool for profiler
* Add deprecation tests and move deprecated tests
* Remove bool option to profiler from docs
* Deprecate bool args to profiler in CHANGELOG
* fixup! Add deprecation warning if using bool for profiler
* fixup! Add deprecation tests and move deprecated tests
* Apply suggestions from code review
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
* Implement suggestions, remove whitespace
* fixup! Implement suggestions, remove whitespace
* Allow bool, str (case insensitive), BaseProfiler
* Add info about bool deprecation to trainer
* fixup! Add info about bool deprecation to trainer
* Move deprecate todo to test_deprecated
* Test wrong profiler type, improve error message
* fixup! Test wrong profiler type, improve error message
* Update pytorch_lightning/trainer/connectors/profiler_connector.py
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
* Apply suggestions from code review
* Readd bool to profiler types, test cli profiler arg
* Remove extra whitespace in doc
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
* Apply suggestions from code review
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
* Update deprecation versions
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
2020-10-27 16:27:16 +05:30
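A sketch of the profiler arguments accepted after #3656 above, per the PR bullets (str, case insensitive; a BaseProfiler instance; bool now deprecated):
    from pytorch_lightning import Trainer
    from pytorch_lightning.profiler import AdvancedProfiler

    trainer = Trainer(profiler="simple")            # string shortcut, case insensitive
    trainer = Trainer(profiler=AdvancedProfiler())  # an explicit profiler instance
    trainer = Trainer(profiler=True)                # still accepted, but deprecated by this change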
Chenglu
8e3faa2da1
get help from docstring ( #4344 )
...
* Add getting help message from docstring
* Fix pep8 issue
* Apply suggestions from code review
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
* Apply suggestions from code review
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: Sean Naren <sean.narenthiran@gmail.com>
2020-10-26 23:38:58 +05:30
chaton
f07ee33db6
BUG - Wandb: Sanitize callable. ( #4320 )
...
* add _sanitize_callable_params
* add call on _val if callable
* clean code formatter
* resolve pep8
* default return function name
* resolve pep8
* Apply suggestions from code review
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
* Update CHANGELOG.md
Co-authored-by: Sean Naren <sean.narenthiran@gmail.com>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
2020-10-26 11:57:03 +00:00
William Falcon
98205fb438
Enable custom apex and amp plugins ( #4355 )
...
* enable custom apex, amp plugin
* enable custom apex, amp plugin
* enable custom apex, amp plugin
* enable custom apex, amp plugin
2020-10-25 17:11:07 -04:00
Dusan Drevicky
6ad299573f
[Metrics] Fix/4237 auc unstable reorder ( #4281 )
...
* =Add deprecation warning for auc reorder
* =Add test for deprecation warning for auc reorder
* Update CHANGELOG
* Add reorder deprecation warning to auc docstring
* Fix pep8 f-string error
* remove duplicate import
Co-authored-by: Nicki Skafte <skaftenicki@gmail.com>
2020-10-25 10:26:40 +01:00
ananthsub
f6efb712ed
Skip replacing dataloader sampler if it's already a distributed sampler ( #4273 )
...
* Update data_loading.py
* Update data_loading.py
* add test + update flag description
* add to changelog
* Update test_dataloaders.py
* fix-pickle
* Update test_dataloaders.py
* Added missing reference calls
* Update tests/trainer/test_dataloaders.py
* Apply suggestions from code review
* Update data_loading.py
* Update test_dataloaders.py
Co-authored-by: Sean Naren <sean.narenthiran@gmail.com>
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
2020-10-23 17:34:07 +01:00
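A sketch of the case #4273 above covers: a DataLoader that already carries a DistributedSampler is now left untouched by Lightning's automatic sampler replacement (presumably the replace_sampler_ddp flag mentioned in the bullets). num_replicas/rank are fixed here only so the sketch runs without an initialised process group:
    import torch
    from torch.utils.data import DataLoader, TensorDataset
    from torch.utils.data.distributed import DistributedSampler

    dataset = TensorDataset(torch.randn(64, 3))

    # Already-distributed sampler: Lightning keeps it instead of injecting a
    # second DistributedSampler around the loader.
    sampler = DistributedSampler(dataset, num_replicas=2, rank=0)
    loader = DataLoader(dataset, batch_size=8, sampler=sampler)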
chaton
3abfec8962
[HOTFIX] ModelCheckpoint - Don't increase current_epoch and global_step if not trained ( #4291 )
...
* add two tests w/wo tempdir
* resolve flake8
* this test is failing
* update bug report
* resolve bug and add test
* remove bug_report
* resolve flake8
* resolve bug
* resolve pep8
* resolve pep8
Co-authored-by: Teddy Koker <teddy.koker@gmail.com>
2020-10-23 11:17:50 +01:00
Rohit Gupta
4c7ebdc32b
Add dirpath and filename parameter in ModelCheckpoint ( #4213 )
...
* Add dirpath and filename parameter in ModelCheckpoint
* remove old function
* chlog
* codefactor
* update tests
* docs
* fix doctest and added tests
* pathlib dirpath
* dep version and docs
* try fix doctest
* pep
* suggestions
Co-authored-by: carmocca <carlossmocholi@gmail.com>
* suggestions
* fix test
* pep
* trigger tests
* Apply suggestions from code review
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
* suggestions
* try fix windows test
* add and update some tests
* trigger tests
* Apply suggestions from code review
* Apply suggestions from code review
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Co-authored-by: William Falcon <waf2107@columbia.edu>
2020-10-23 09:59:12 +05:30
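A sketch of the new arguments from #4213 above, which split the older single filepath into a directory plus a metric-driven filename template:
    from pytorch_lightning import Trainer
    from pytorch_lightning.callbacks import ModelCheckpoint

    checkpoint_cb = ModelCheckpoint(
        dirpath="checkpoints/",               # where checkpoints are written
        filename="{epoch}-{val_loss:.2f}",    # template filled from logged metrics
        monitor="val_loss",
    )
    trainer = Trainer(callbacks=[checkpoint_cb])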
Sean Naren
065cc94112
Fix bug comparing max_steps to global step which inits at 0 ( #4278 )
...
* Fix bug comparing max_steps to global step which inits at 0
* Added test to ensure accumulate grad batch works with max steps
* check fix with TODO test
* correct call counts
* Add check to ensure we've finished accumulation of this global step before exiting loop in conjunction with max steps
* Remove + 1 check in test as this was incorrect
* Update incorrect expected outputs in lr finder test
* Added brackets for clarity
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
2020-10-22 13:58:59 +01:00
Rohit Gupta
af449310aa
limit monitor callback with log_every_n_steps ( #3881 )
...
* limit monitor callback with row_log_interval
* try fix gpu test
* log_every_n_steps
* Apply suggestions from code review
* Apply suggestions from code review
* rebase and staticmethod
* suggestions
Co-authored-by: Jeff Yang <ydcjeff@outlook.com>
2020-10-22 16:38:03 +05:30
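In Trainer terms (row_log_interval having been renamed to log_every_n_steps per the bullets of #3881 above), the monitor callback now honours the same throttle:
    from pytorch_lightning import Trainer

    # Logging-rate limit that monitoring callbacks now respect as well.
    trainer = Trainer(log_every_n_steps=50)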
William Falcon
753362d0a4
enable ddp as a plugin ( #4285 )
...
* enable custom ddp plugin
* enable custom ddp plugin
* enable custom ddp plugin
* enable custom ddp plugin
* enable custom ddp plugin
* enable custom ddp plugin
* enable custom ddp plugin
* enable custom ddp plugin
* enable custom ddp plugin
* enable custom ddp plugin
* enable custom ddp plugin
Co-authored-by: chaton <thomas@grid.ai>
2020-10-22 05:15:51 -04:00
Nicki Skafte
a937394312
[Metrics] Unification of regression ( #4166 )
...
* moved to utility
* add files
* unify
* add desc
* update
* end of line
* Apply suggestions from code review
Co-authored-by: Justus Schock <12886177+justusschock@users.noreply.github.com>
* Apply suggestions from code review
Co-authored-by: Justus Schock <12886177+justusschock@users.noreply.github.com>
* add back functional test in new interface
* pep8
* doctest fix
* test name fix
* unify psnr + add class psnr, TODO: psnr test refactor ala mean squared error
* unify psnr
* rm unused code
* pep8
* docs
* unify ssim
* lower tolerance for ssim
* fix import
* pep8
* docs
* flake8
* test smaller images
* trying to fix test
* no ddp test for ssim
* pep8
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: Justus Schock <12886177+justusschock@users.noreply.github.com>
Co-authored-by: Teddy Koker <teddy.koker@gmail.com>
2020-10-21 18:05:59 -04:00
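A hedged sketch of the unified class interface for the regression metrics named in the #4166 bullets above (mean squared error, PSNR, SSIM); the exact exported class names are assumptions:
    import torch
    from pytorch_lightning.metrics import MeanSquaredError, PSNR, SSIM

    preds = torch.rand(4, 3, 16, 16)
    target = torch.rand(4, 3, 16, 16)

    mse, psnr, ssim = MeanSquaredError(), PSNR(), SSIM()
    # Class metrics are callable: they return the batch value while also
    # accumulating state for a later compute().
    print(mse(preds, target), psnr(preds, target), ssim(preds, target))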
Mauricio Villegas
546476c704
Allow changing the logged step value in validation_step ( #4130 )
...
* Fix to bug identified in https://github.com/PyTorchLightning/pytorch-lightning/issues/4102
* update tests
* chlog
Co-authored-by: rohitgr7 <rohitgr1998@gmail.com>
2020-10-22 03:03:07 +05:30
Carlos Mocholí
2549ca40e6
Clean up optimizer code ( #3587 )
...
* Update optimizer code
* Update CHANGELOG
* Fix tuple of one list case
* Update docs
* Fix pep issue
* Minor typo [skip-ci]
* Use minimal match
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
* Apply suggestions from code review
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
2020-10-21 21:12:48 +02:00
Justus Schock
0ec4107697
Optimizer closure ( #4190 )
...
* closure for all optimizers
* rename hook and take care of alternating backwards
* add comment
* training_loop_fix
* closure whenever possible
* training_loop
* simple tests that count backward calls
* fix test to work with closure
* remove debugging statement
* better place
* check grads after backward
* start fixing manual optimization
* skip step when result returned by closure was None
* fix gradient clipping test to work with closure
* attribute dict result only for automatic optimization
* adjust backward calls in accelerator
* adjust where to call gradient clipping
* adjust backward calls in tests
* Apply suggestions from code review
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
* pass kwargs to xla optimizer
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
2020-10-21 19:34:29 +01:00
William Falcon
8a20d6af51
make save fx part of model checkpoint cb ( #4284 )
2020-10-21 10:06:42 -04:00
Dusan Drevicky
96785b99df
Feature/4244 iou input expectations ( #4261 )
...
* =Add iou input checks
* =Add test for iou input checks
* =Update docstring for iou pred and target
2020-10-21 09:01:24 -04:00
Carlos Mocholí
e0f9799dbf
Add strict option to lr_scheduler dict ( #3586 )
...
* Add strict option to lr_scheduler dict
* Update docs
* Unnecessary "else" after "raise"
* Update CHANGELOG
* Fix rebase
2020-10-21 14:14:37 +02:00
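A sketch of where the new key from #3586 above goes: the scheduler dictionary returned from configure_optimizers. The surrounding LightningModule is a hypothetical minimal example, and the raise-vs-warn semantics of strict are inferred from the PR title:
    import torch
    import pytorch_lightning as pl

    class LitModel(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.layer = torch.nn.Linear(4, 1)

        def configure_optimizers(self):
            optimizer = torch.optim.SGD(self.parameters(), lr=0.1)
            scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer)
            # "strict" is the new key: whether a missing monitored metric is
            # treated as an error or merely skipped with a warning.
            return [optimizer], [{
                "scheduler": scheduler,
                "monitor": "val_loss",
                "strict": True,
            }]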
Abhinav Gupta
5d1583d7f2
Corrected f_beta computation ( #4183 )
...
* Update f_beta.py
Added METRIC_EPS in the denominator to avoid nan values in f_beta score.
* Update f_beta.py
Made changes flake8 compliant
* Update f_beta.py
Makes use of class_reduce for macro f_beta computation to avoid nans
* Update f_beta.py
Made flake8 compliant
* Corrected F beta computation
* Removed offset to make the computation precise
2020-10-21 10:52:04 +02:00
Sean Naren
c336881959
Added fix to ensure that custom logged metrics within test_epoch_end are appended to the result object even without step reduced metrics ( #4251 )
2020-10-20 18:33:18 +02:00
Jirka Borovec
3777988502
add test for model hooks ( #4010 )
2020-10-20 13:33:46 +01:00
Jirka Borovec
f37444fa3e
CI: add flake8 ( #4239 )
2020-10-19 21:20:17 +01:00
Teddy Koker
827a557269
Add persistent flag to Metric.add_state ( #4195 )
...
* add persistent flag to add_state in metrics
* wrap register_buffer with try catch
* pep8
* use loose version
* test
* pep8
2020-10-16 14:36:03 -04:00
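A sketch of the flag from #4195 above in a custom metric; everything except the persistent argument is the pre-existing add_state API, and the accuracy logic is only illustrative:
    import torch
    from pytorch_lightning.metrics import Metric

    class MyAccuracy(Metric):
        def __init__(self):
            super().__init__()
            # persistent=False keeps these buffers out of the state_dict,
            # mirroring torch's register_buffer(persistent=...) flag.
            self.add_state("correct", default=torch.tensor(0), dist_reduce_fx="sum", persistent=False)
            self.add_state("total", default=torch.tensor(0), dist_reduce_fx="sum", persistent=False)

        def update(self, preds, target):
            self.correct += (preds == target).sum()
            self.total += target.numel()

        def compute(self):
            return self.correct.float() / self.total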
Jirka Borovec
3fe479f348
fix hparams assign in init ( #4189 )
2020-10-16 13:57:21 +01:00
Jirka Borovec
4204ef7b53
Bugfix/4156 filter hparams for yaml - fsspec ( #4158 )
...
* add test
* fix
* sleepy boy
* chlog
* Apply suggestions from code review
Co-authored-by: Justus Schock <12886177+justusschock@users.noreply.github.com>
Co-authored-by: Justus Schock <12886177+justusschock@users.noreply.github.com>
2020-10-15 16:53:42 +02:00
William Falcon
72f19768c8
remove duplicate metric vs step log for train loop ( #4173 )
...
* remove duplicate metric vs step log
* remove duplicate metric vs step log
* remove duplicate metric vs step log
* fix ddp index issue
2020-10-15 10:47:00 -04:00
William Falcon
45d05ff68d
Fixes #4141 ( #4169 )
...
* fix val epoch agg
* fix val agg metrics
* fix val agg metrics
* fix val agg metrics
2020-10-15 09:12:05 -04:00
Jirka Borovec
f064682786
save initial arguments ( #4163 )
...
* save initial arguments
* typing
* chlog
* .
2020-10-15 08:30:49 -04:00
Jirka Borovec
4290c9e1e8
Tests: clean metrics ( #4152 )
...
* name inputs
* sk rename
* imports
2020-10-15 07:50:08 +02:00
Nicki Skafte
edcb6e49b9
Speedup of metric tests ( #4122 )
...
* speedup
* something working
* update the rest
* more desc
* recurse tests/metrics again
* pep8
Co-authored-by: Teddy Koker <teddy.koker@gmail.com>
2020-10-14 13:51:58 -04:00
Jirka Borovec
2c3b512c50
reverted "temporary drop metrics tests while speeding them up" and SKIP ( #4115 )
...
* Revert "temporary drop metrics tests while speeding them up (#4071 )"
This reverts commit 86c70622fb.
* skip metrics tests
* skipping
2020-10-14 19:01:43 +02:00
Stef | ステフ
fa737a5eb9
Add trace functionality to the function to_torchscript ( #4142 )
...
* Add trace functionality to the function to_torchscript
* used wrong parameter name in test
* fix indentation to conform to code style
2020-10-14 09:20:52 -04:00
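A sketch of the tracing path added in #4142 above next to the default scripting path; the LightningModule is a hypothetical minimal model, and the keyword names (method, example_inputs) are assumptions based on the era's API:
    import torch
    import pytorch_lightning as pl

    class LitModel(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.layer = torch.nn.Linear(4, 2)

        def forward(self, x):
            return self.layer(x)

    model = LitModel()
    scripted = model.to_torchscript()  # default: TorchScript via scripting
    # New here: tracing, which needs example inputs to run the forward pass.
    traced = model.to_torchscript(method="trace", example_inputs=torch.randn(1, 4))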
Sean Naren
98eb736496
Added getstate/setstate method for torch.save serialization ( #4127 )
...
* Added getstate/setstate method for torch.save serialization, added additional Optional Typing to results object
* Added tests to ensure torch.save does not fail
* Added flags to ensure compatible ddp cpu environment
* Removed torch version check due to minimum already being 1.3, reduced epochs for speed
* Moved tests to separate file
* Update to accelerator, move to ddp_spawn to prevent hanging ddp
2020-10-13 16:47:23 -04:00
William Falcon
09c2020a93
notices ( #4118 )
2020-10-13 07:18:07 -04:00
William Falcon
2d5a7f5e7d
Fixes #3276 ( #4116 )
2020-10-13 06:42:11 -04:00
William Falcon
bf2067a609
enabled manual returns ( #4089 )
2020-10-12 10:06:17 -04:00
William Falcon
5b645d713e
Covv1 ( #4072 )
...
* temporary drop metrics tests while speeding them up
* cov
* cov
* docs
2020-10-11 10:21:53 -04:00
William Falcon
86c70622fb
temporary drop metrics tests while speeding them up ( #4071 )
2020-10-11 07:38:58 -04:00
William Falcon
7ffe05a3d1
ref: accelerator names ( #4066 )
...
* ref: accelerator names
* docs
2020-10-11 01:05:14 -04:00
William Falcon
a4b9221fc5
ref: decouple apex second attempt part n/n ( #4065 )
...
* ref: decouple apex second attempt part n/n
* ref: decouple apex second attempt part n/n
2020-10-10 22:04:50 -04:00
William Falcon
dbfe2b6129
ref: decouple apex second attempt part 9/n ( #4063 )
...
* ref: decouple apex second attempt part 9/n
* ref: decouple apex second attempt part 9/n
2020-10-10 18:44:24 -04:00