From c9156757fcfffcbba2be5aea1a0c08899c9bd1ca Mon Sep 17 00:00:00 2001
From: William Falcon
Date: Thu, 27 Jun 2019 14:39:11 -0400
Subject: [PATCH] debugging and gpu guide

---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 818326317a..c01893c0db 100644
--- a/README.md
+++ b/README.md
@@ -30,8 +30,8 @@ Keras and fast.ai are too abstract for researchers. Lightning abstracts the full
 
 Because you want to use best practices and get gpu training, multi-node training, checkpointing, mixed-precision, etc... for free, but still want granular control of the meat of the training, validation and testing loops.
 
 To use lightning do 2 things:
-1. [Define a trainer](https://github.com/williamFalcon/pytorch-lightning/blob/master/docs/source/examples/basic_trainer.py) (which will run ALL your models).
-2. [Define a model](https://github.com/williamFalcon/pytorch-lightning/blob/master/docs/source/examples/example_model.py).
+1. [Define a Trainer](https://github.com/williamFalcon/pytorch-lightning/blob/master/examples/new_project_templates/trainer_cpu_template.py) (which will run ALL your models).
+2. [Define a LightningModel](https://github.com/williamFalcon/pytorch-lightning/blob/master/examples/new_project_templates/lightning_module_template.py).
 
 ## What are some key lightning features?
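
For context, a minimal sketch of what the two steps in the patched README amount to: define a `LightningModule` subclass and hand it to a `Trainer`. This is not taken from the linked templates; the class name `LitModel`, the toy dataset, and the hook names follow the later stable `pytorch_lightning` API and are illustrative assumptions, as the June 2019 templates referenced here used earlier variants of these hooks.

```python
# Illustrative sketch (not part of the patch). Hook names follow the later,
# stable pytorch_lightning API; the 2019 templates differed in detail.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


# Step 2 from the README: a LightningModule owns the model,
# the training step, and the optimizer.
class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 2)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.cross_entropy(self(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)


# Step 1 from the README: a Trainer runs ANY LightningModule.
if __name__ == "__main__":
    # Toy dataset purely for illustration.
    dataset = TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,)))
    train_loader = DataLoader(dataset, batch_size=8)

    model = LitModel()
    trainer = pl.Trainer(max_epochs=1)
    trainer.fit(model, train_loader)
```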