lightning/pytorch_lightning/core
Sean Naren 742c48e994
[Fix] Ensure we set the eval/train flag correctly on accelerator model (#6877)
* Ensure we move the model to eval mode before running evaluation
* Ensure we set the flag appropriately across all stages
* Add test, move hooks logic
* Apply the same fix to the validate loop
* Update pytorch_lightning/trainer/trainer.py
* Fix function name
* Fix order, add predict
* Shorten the name
* Fix input dm; drop the duplicate call to the predict start hook, as it is already called in the setup function
* Use hook, remove double call
2021-04-08 14:04:26 -04:00
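The commit above is about keeping the underlying model's train/eval flag in sync with the stage being run. The sketch below is an illustration only, not the actual Lightning internals: it shows the plain-PyTorch behavior the fix guarantees, switching the module to eval mode before an evaluation loop and restoring train mode afterwards. `SimpleNet` and `run_validation` are made-up names for the example.

```python
import torch
from torch import nn


class SimpleNet(nn.Module):
    """Hypothetical module used only to illustrate the train/eval flag."""

    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(4, 2)
        self.dropout = nn.Dropout(0.5)  # behaves differently in train vs. eval mode

    def forward(self, x):
        return self.layer(self.dropout(x))


def run_validation(model: nn.Module, batches):
    """Sketch of an evaluation loop that sets the eval flag, as the fix ensures."""
    was_training = model.training
    model.eval()  # set the eval flag before running evaluation
    with torch.no_grad():
        outputs = [model(batch) for batch in batches]
    if was_training:
        model.train()  # restore the train flag once evaluation is done
    return outputs


model = SimpleNet()
assert model.training  # modules start in train mode
run_validation(model, [torch.randn(8, 4)])
assert model.training  # the flag is restored after the evaluation stage
```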
File           | Latest commit                                                                          | Date
__init__.py    | CI: add flake8 (#4239)                                                                 | 2020-10-19 21:20:17 +01:00
datamodule.py  | Support teardown hook on DataModule (#4673)                                            | 2021-03-25 07:51:55 -05:00
decorators.py  | Add before_batch_transfer and after_batch_transfer hooks (#3671)                       | 2021-02-18 06:58:12 -05:00
grads.py       | formatting 5/n: Core (#5721)                                                           | 2021-02-08 14:29:43 -05:00
hooks.py       | [Fix] Ensure we set the eval/train flag correctly on accelerator model (#6877)         | 2021-04-08 14:04:26 -04:00
lightning.py   | Simplify deprecations (#6620)                                                          | 2021-03-25 15:26:38 +01:00
memory.py      | Handle torch.jit scripted modules in layer summary (#6511)                             | 2021-03-15 03:17:42 +01:00
optimizer.py   | [bugfix] Check LightningOptimizer doesn't delete optimizer hooks (#6305)               | 2021-03-04 20:11:59 +00:00
saving.py      | Fix csv extension check (#6436)                                                        | 2021-04-08 01:16:31 +00:00
step_result.py | Remove legacy support for the magic `log`/`progress_bar` keys in dict returns (#6734)  | 2021-03-31 00:28:04 +02:00
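The latest commit on datamodule.py adds a teardown hook to LightningDataModule (#4673). As a rough, hedged sketch of how that hook pairs with setup in a datamodule: `RandomDataModule` and its tensor dataset are invented for illustration, while the `setup`/`teardown`/`train_dataloader` hook names follow the LightningDataModule API of this era.

```python
from typing import Optional

import torch
from torch.utils.data import DataLoader, TensorDataset

import pytorch_lightning as pl


class RandomDataModule(pl.LightningDataModule):
    """Hypothetical datamodule sketch showing the setup/teardown pair."""

    def setup(self, stage: Optional[str] = None):
        # Called for the given stage; build the datasets it needs.
        self.train_set = TensorDataset(torch.randn(64, 4), torch.randint(0, 2, (64,)))

    def teardown(self, stage: Optional[str] = None):
        # Counterpart to setup(); release resources once the stage is finished.
        self.train_set = None

    def train_dataloader(self):
        return DataLoader(self.train_set, batch_size=8)
```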