From 77d085134b77fc8f0d6ff68ce4ad2a94b9ed47f2 Mon Sep 17 00:00:00 2001
From: William Falcon
Date: Fri, 23 Aug 2019 03:45:14 -0400
Subject: [PATCH] Update README.md

---
 README.md | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/README.md b/README.md
index f52e3010b5..f146856fec 100644
--- a/README.md
+++ b/README.md
@@ -33,15 +33,13 @@ pip install pytorch-lightning
 
 **[View the docs here](https://williamfalcon.github.io/pytorch-lightning/)**
 
 ## What is it?
-Lightning is a very lightweight wrapper on PyTorch. This means you don't have to learn a new library. It defers core training and validation logic to you and automates the rest. It guarantees tested, correct, modern best practices for the automated parts.
+Lightning is a very lightweight wrapper on PyTorch. This means you don't have to learn a new library. To use Lightning, simply refactor your research code into the [LightningModule](https://github.com/williamFalcon/pytorch-lightning#how-do-i-do-use-it) format and Lightning will automate the rest. Lightning guarantees tested, correct, modern best practices for the automated parts.
 
 ## Why do I want to use lightning?
-When starting a new project the last thing you want to do is recode a training loop, multi-cluster training, 16-bit precision, early-stopping, model loading/saving, when to validate, etc... You're likely to spend a long time ironing out all the bugs without even getting to the core of your research.
+Every research project starts the same, a model, a training loop, validation loop, etc. As your research advances, you're likely to need distributed training, 16-bit precision, checkpointing, gradient accumulation, etc.
 
-With lightning, you guarantee those parts of your code work so you can focus on the meat of the research: The data and the training/validation loop logic.
-
-Don't worry about training on multiple gpus or speeding up your code, lightning will do that for you!
+Lightning sets up all the boilerplate state-of-the-art training for you so you can focus on the research.
 
 ---
 ## README Table of Contents
@@ -59,6 +57,8 @@ Don't worry about training on multiple gpus or speeding up your code, lightning
 ---
 ## How do I do use it?
+Think about Lightning as refactoring your research code instead of using a new framework.
+
 The research code goes into a [LightningModule]((https://williamfalcon.github.io/pytorch-lightning/LightningModule/RequiredTrainerInterface/)) which you fit using a Trainer.
 
 Think of the LightningModule as a *system* such as seq-2-seq, GAN, etc...
 However, the LightningModule can ALSO just be a simple classifier such as the example below.
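For context, the refactor the patched text describes (research code goes into a LightningModule, which a Trainer then fits) looks roughly like the sketch below. This example is not part of the patch: the `LitClassifier` name, toy data, and hyperparameters are illustrative, and the method names follow the `pytorch_lightning` API of recent releases, which may differ from the version current in August 2019.

```python
# Minimal sketch of a LightningModule "simple classifier" refactor (illustrative only).
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class LitClassifier(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.model = nn.Sequential(nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 10))

    def forward(self, x):
        return self.model(x)

    def training_step(self, batch, batch_idx):
        # The research logic you write yourself; Lightning runs the loop around it.
        x, y = batch
        return nn.functional.cross_entropy(self(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)


if __name__ == "__main__":
    # Dummy data so the sketch is self-contained.
    x = torch.randn(64, 28 * 28)
    y = torch.randint(0, 10, (64,))
    loader = DataLoader(TensorDataset(x, y), batch_size=32)

    # The Trainer automates the boilerplate: loops, checkpointing, devices, etc.
    trainer = pl.Trainer(max_epochs=1)
    trainer.fit(LitClassifier(), loader)
```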