updated docs

This commit is contained in:
William Falcon 2019-08-01 10:11:26 -04:00
parent 7b774beb0c
commit 598e1accb5
9 changed files with 12 additions and 12 deletions

@@ -4,7 +4,7 @@
</a>
</p>
<h3 align="center">
-Pytorch Lightning
+PyTorch Lightning
</h3>
<p align="center">
The Keras for ML researchers using PyTorch. More control. Less boilerplate.

@@ -300,7 +300,7 @@ def tng_dataloader(self)
Called by lightning during training loop. Make sure to use the @ptl.data_loader decorator, this ensures not calling this function until the data are needed.
##### Return
-Pytorch DataLoader
+PyTorch DataLoader
**Example**
@@ -327,7 +327,7 @@ def tng_dataloader(self)
Called by lightning during validation loop. Make sure to use the @ptl.data_loader decorator, this ensures not calling this function until the data are needed.
##### Return
-Pytorch DataLoader
+PyTorch DataLoader
**Example**
@@ -355,7 +355,7 @@ def test_dataloader(self)
Called by lightning during test loop. Make sure to use the @ptl.data_loader decorator, this ensures not calling this function until the data are needed.
##### Return
-Pytorch DataLoader
+PyTorch DataLoader
**Example**
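The deferred-call behaviour these docs describe for `@ptl.data_loader` can be sketched in plain Python. The `data_loader` decorator and `MyModel` class below are a hypothetical stand-in to show the idea (lazy call, cached result), not the library's actual implementation:

```python
# Minimal sketch of a lazy data-loader decorator: the wrapped method is
# not called until first access, and its result is cached afterwards.
def data_loader(fn):
    attr = '_lazy_' + fn.__name__

    @property
    def wrapper(self):
        if not hasattr(self, attr):
            setattr(self, attr, fn(self))  # first access triggers the call
        return getattr(self, attr)
    return wrapper


class MyModel:
    def __init__(self):
        self.calls = 0

    @data_loader
    def tng_dataloader(self):
        self.calls += 1  # a real module would build a PyTorch DataLoader here
        return ['batch0', 'batch1']


m = MyModel()
assert m.calls == 0      # not called at construction time
_ = m.tng_dataloader     # first access builds the loader
_ = m.tng_dataloader     # cached; no second call
assert m.calls == 1
```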

@@ -31,7 +31,7 @@ y_hat = pretrained_model(x)
| Param | description |
|---|---|
-| weights_path | Path to a pytorch checkpoint |
+| weights_path | Path to a PyTorch checkpoint |
| tags_csv | Path to meta_tags.csv file generated by the test-tube Experiment |
| on_gpu | if True, puts model on GPU. Make sure to use transforms option if model devices have changed |
| map_location | A dictionary mapping saved weight GPU devices to new GPU devices |
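The `map_location` parameter above is a table from saved GPU devices to the devices to load onto. A tiny illustrative helper (`remap_devices` is hypothetical, for explanation only; the real remapping happens inside checkpoint loading):

```python
# Sketch of what a map_location dict expresses: each device a weight was
# saved on is replaced by its mapped device, or kept if unmapped.
def remap_devices(saved_devices, map_location):
    return [map_location.get(d, d) for d in saved_devices]


# Checkpoint saved on a 2-GPU machine, loaded on a single-GPU machine:
saved = ['cuda:0', 'cuda:1']
mapping = {'cuda:1': 'cuda:0'}
remap_devices(saved, mapping)  # ['cuda:0', 'cuda:0']
```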

@@ -52,7 +52,7 @@ Trainer(experiment=exp)
---
### Tensorboard support
-The experiment object is a strict subclass of Pytorch SummaryWriter. However, this class
+The experiment object is a strict subclass of PyTorch SummaryWriter. However, this class
also snapshots every detail about the experiment (data folder paths, code, hyperparams),
and allows you to visualize it using tensorboard.
``` {.python}

@@ -5,7 +5,7 @@ There are cases when you might want to do something different at different parts
To enable a hook, simply override the method in your LightningModule and the trainer will call it at the correct time.
**Contributing** If there's a hook you'd like to add, simply:
-1. Fork PytorchLightning.
+1. Fork PyTorchLightning.
2. Add the hook [here](https://github.com/williamFalcon/pytorch-lightning/blob/master/pytorch_lightning/root_module/hooks.py).
3. Add the correct place in the [Trainer](https://github.com/williamFalcon/pytorch-lightning/blob/master/pytorch_lightning/models/trainer.py) where it should be called.
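The hook mechanism described above can be sketched in plain Python. This is a hypothetical miniature (the class and method names mirror the docs but this is not the library's code): the base class defines no-op hooks, the user overrides the ones they care about, and the trainer calls them at fixed points:

```python
# Base class with no-op hooks; overriding is optional.
class Hooks:
    def on_epoch_start(self):
        pass

    def on_epoch_end(self):
        pass


class MyModule(Hooks):
    def __init__(self):
        self.events = []

    def on_epoch_end(self):        # user override; the trainer calls it
        self.events.append('epoch_end')


def run_one_epoch(module):
    module.on_epoch_start()        # trainer invokes hooks at the correct time
    # ... training steps would run here ...
    module.on_epoch_end()


m = MyModule()
run_one_epoch(m)                   # m.events is now ['epoch_end']
```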

@@ -64,7 +64,7 @@ But of course the fun is in all the advanced things it can do:
- [Gradient Clipping](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#gradient-clipping)
- [Hooks](hooks)
- [Learning rate scheduling](https://williamfalcon.github.io/pytorch-lightning/LightningModule/RequiredTrainerInterface/#configure_optimizers)
-- [Use multiple optimizers (like GANs)](https://williamfalcon.github.io/pytorch-lightning/Pytorch-Lightning/LightningModule/#configure_optimizers)
+- [Use multiple optimizers (like GANs)](https://williamfalcon.github.io/pytorch-lightning/PyTorch-Lightning/LightningModule/#configure_optimizers)
- [Set how much of the training set to check (1-100%)](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#set-how-much-of-the-training-set-to-check)
**Validation loop**

@@ -71,7 +71,7 @@ one could be a seq-2-seq model, both (optionally) ran by the same trainer file.
- [Gradient Clipping](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#gradient-clipping)
- [Hooks](https://williamfalcon.github.io/pytorch-lightning/Trainer/hooks/)
- [Learning rate scheduling](https://williamfalcon.github.io/pytorch-lightning/LightningModule/RequiredTrainerInterface/#configure_optimizers)
-- [Use multiple optimizers (like GANs)](https://williamfalcon.github.io/pytorch-lightning/Pytorch-Lightning/LightningModule/#configure_optimizers)
+- [Use multiple optimizers (like GANs)](https://williamfalcon.github.io/pytorch-lightning/PyTorch-Lightning/LightningModule/#configure_optimizers)
- [Set how much of the training set to check (1-100%)](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#set-how-much-of-the-training-set-to-check)
###### Validation loop

@@ -1,10 +1,10 @@
-site_name: Pytorch lightning Documentation
+site_name: PyTorch lightning Documentation
theme:
name: 'material'
docs_dir: docs
repo_url: https://github.com/williamFalcon/pytorch-lightning
site_dir: 'site'
-site_description: 'Documentation for Pytorch LightningModule, the researcher version of keras.'
+site_description: 'Documentation for PyTorch LightningModule, the researcher version of keras.'
dev_addr: '0.0.0.0:8000'
#google_analytics: ['UA-aasd', 'sitename']

@@ -1,4 +1,4 @@
-# Pytorch-Lightning Tests
+# PyTorch-Lightning Tests
## Running tests
The automatic travis tests ONLY run CPU-based tests. Although these cover most of the use cases,