update org paths & convert logos (#685)

* fix typos
* update org paths
* update links from README to docs
* add svg logo
* add svg logo-text
* update logos
* testing temp paths
* prune links from readme
* optimize imports
* update logo
* update paths in README
* missing imports
@@ -6,7 +6,7 @@ We're currently recruiting for a team of 5 core maintainers.
As a core maintainer you will have a strong say in the direction of the project. Big changes will require a majority of maintainers to agree.

### Code of conduct

First and foremost, you'll be evaluated against [these core values](https://github.com/williamFalcon/pytorch-lightning/blob/master/.github/CONTRIBUTING.md). Any code we commit or feature we add needs to align with those core values.
First and foremost, you'll be evaluated against [these core values](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/.github/CONTRIBUTING.md). Any code we commit or feature we add needs to align with those core values.

### The bar for joining the team

Lightning is being used to solve really hard problems at the top AI labs in the world. As such, the bar for adding team members is extremely high. Candidates must have solid engineering skills, have a good eye for user experience, and must be a power user of Lightning and PyTorch.
@@ -8,8 +8,8 @@ assignees: ''
---

### Common bugs:

1. Tensorboard not showing in Jupyter-notebook see [issue 79](https://github.com/williamFalcon/pytorch-lightning/issues/79).
2. PyTorch 1.1.0 vs 1.2.0 support [see FAQ](https://github.com/williamFalcon/pytorch-lightning#faq)
1. Tensorboard not showing in Jupyter-notebook see [issue 79](https://github.com/PyTorchLightning/pytorch-lightning/issues/79).
2. PyTorch 1.1.0 vs 1.2.0 support [see FAQ](https://github.com/PyTorchLightning/pytorch-lightning#faq)

## 🐛 Bug
@@ -1,7 +1,7 @@
# Before submitting

- [ ] Was this discussed/approved via a Github issue? (no need for typos, doc improvements)
- [ ] Did you read the [contributor guideline](https://github.com/williamFalcon/pytorch-lightning/blob/master/.github/CONTRIBUTING.md)?
- [ ] Did you read the [contributor guideline](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/.github/CONTRIBUTING.md)?
- [ ] Did you make sure to update the docs?
- [ ] Did you write any new necessary tests?
@@ -1,29 +1,26 @@
# project
.DS_Store
.data/
run_configs/
test_tube_logs/
test_tube_data/
datasets/
model_weights/
app/models/
pip-wheel-metadata/
test_tube_exp/
tests/tests_tt_dir/
tests/save_dir
default/
lightning_logs/
tests/tests/
*.rst

# Test-tube
test_tube_logs/
test_tube_data/
test_tube_exp/

# Documentations
docs/source/pl_examples*.rst
docs/source/pytorch_lightning*.rst
/docs/source/*.md

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
example.py
timit_data/
LJSpeech-1.1/

# C extensions
*.so

@@ -32,7 +29,6 @@ LJSpeech-1.1/

# Distribution / packaging
.Python
env/
ide_layouts/
build/
develop-eggs/

@@ -44,7 +40,6 @@ lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg

@@ -70,6 +65,9 @@ nosetests.xml
coverage.xml
*.cover
.hypothesis/
tests/tests_tt_dir/
tests/save_dir
tests/tests/

# Translations
*.mo

@@ -87,7 +85,7 @@ instance/
.scrapy

# Sphinx documentation
docs/_build/
docs/build/

# PyBuilder
target/

@@ -109,6 +107,7 @@ celerybeat-schedule

# virtualenv
.venv
env/
venv/
ENV/

@@ -126,4 +125,6 @@ ENV/
.mypy_cache/

# data
.data/
datasets/
mnist/
@@ -20,5 +20,5 @@ formats: all
python:
  version: 3.7
  install:
    #- requirements: requirements.txt
    - requirements: docs/requirements.txt
    #- requirements: requirements.txt
@@ -1,7 +1,6 @@
# Manifest syntax https://docs.python.org/2/distutils/sourcedist.html
graft wheelhouse

recursive-include birl *.py
recursive-exclude __pycache__ *.py[cod] *.orig

# Include the README
README.md
@@ -32,7 +32,7 @@ pip install pytorch-lightning
```

## Docs
**[View the docs here](https://pytorch-lightning.readthedocs.io/en/latest)**
**[View the docs here](https://pytorch-lightning.rtfd.io/en/latest)**
** DOCS TEMPORARILY have broken links because we recently switched orgs from williamfalcon/pytorch-lightning to pytorchlightning/pytorch-lightning [jan 15, 2020].

As a temporary hack, when you get the 404, replace williamfalcon.github.io with pytorchlightning.github.io.
@@ -84,12 +84,12 @@ Lightning sets up all the boilerplate state-of-the-art training for you so you c
---

## How do I use it?
Think about Lightning as refactoring your research code instead of using a new framework. The research code goes into a [LightningModule](https://williamfalcon.github.io/pytorch-lightning/LightningModule/RequiredTrainerInterface/) which you fit using a Trainer.
Think about Lightning as refactoring your research code instead of using a new framework. The research code goes into a [LightningModule](https://pytorch-lightning.rtfd.io/en/latest/LightningModule/RequiredTrainerInterface/) which you fit using a Trainer.
The LightningModule defines a *system* such as seq-2-seq, GAN, etc... It can ALSO define a simple classifier such as the example below.
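The split the README describes — research code in the module, the generic training loop in a reusable trainer — can be sketched without any dependency. Everything below (`ToyModule`, `ToyTrainer`, the hook names) is an illustrative stand-in, not the real pytorch_lightning API:

```python
# Toy stand-in for the Lightning split: research code lives in a "module",
# the engineering (the loop) lives in a reusable "trainer".

class ToyModule:
    """Research code: model state plus how to compute a loss/gradient."""

    def __init__(self, lr=0.01):
        self.w = 0.0   # a single learnable weight
        self.lr = lr

    def training_step(self, batch):
        x, y = batch
        pred = self.w * x
        loss = (pred - y) ** 2
        grad = 2 * (pred - y) * x   # d(loss)/dw for the squared error
        return loss, grad

    def apply_gradient(self, grad):
        self.w -= self.lr * grad


class ToyTrainer:
    """Engineering code: a loop that works for any module with the hooks."""

    def fit(self, module, dataloader, max_epochs=100):
        for _ in range(max_epochs):
            for batch in dataloader:
                _, grad = module.training_step(batch)
                module.apply_gradient(grad)
        return module


# Usage: learn y = 2x from a few points.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
model = ToyTrainer().fit(ToyModule(lr=0.01), data)
print(round(model.w, 2))  # prints 2.0
```

The point of the design is that `ToyTrainer.fit` never inspects what the module computes — the same boundary Lightning draws between engineering and research code.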

To use lightning do 2 things:
1. [Define a LightningModule](https://williamfalcon.github.io/pytorch-lightning/LightningModule/RequiredTrainerInterface/)
1. [Define a LightningModule](https://pytorch-lightning.rtfd.io/en/latest/LightningModule/RequiredTrainerInterface/)
**WARNING:** This syntax is for version 0.5.0+ where abbreviations were removed.
```python
import os
@@ -165,7 +165,7 @@ To use lightning do 2 things:
        # OPTIONAL
        return DataLoader(MNIST(os.getcwd(), train=False, download=True, transform=transforms.ToTensor()), batch_size=32)
```
2. Fit with a [trainer](https://williamfalcon.github.io/pytorch-lightning/Trainer/)
2. Fit with a [trainer](https://pytorch-lightning.rtfd.io/en/latest/Trainer/)
```python
from pytorch_lightning import Trainer
@@ -277,77 +277,20 @@ Lightning also adds a text column with all the hyperparameters for this experime

![tensorboard-support](docs/source/_static/images/tf_tags.png)

## Lightning automates all of the following ([each is also configurable](https://williamfalcon.github.io/pytorch-lightning/Trainer/)):

#### Checkpointing

- [Checkpoint callback](https://williamfalcon.github.io/pytorch-lightning/Trainer/Checkpointing/#model-saving)
- [Model saving](https://williamfalcon.github.io/pytorch-lightning/Trainer/Checkpointing/#model-saving)
- [Model loading](https://williamfalcon.github.io/pytorch-lightning/LightningModule/methods/#load-from-metrics)
- [Restoring training session](https://williamfalcon.github.io/pytorch-lightning/Trainer/Checkpointing/#restoring-training-session)

#### Computing cluster (SLURM)

- [Running grid search on a cluster](https://williamfalcon.github.io/pytorch-lightning/Trainer/SLURM%20Managed%20Cluster#running-grid-search-on-a-cluster)
- [Walltime auto-resubmit](https://williamfalcon.github.io/pytorch-lightning/Trainer/SLURM%20Managed%20Cluster#walltime-auto-resubmit)

#### Debugging

- [Fast dev run](https://williamfalcon.github.io/pytorch-lightning/Trainer/debugging/#fast-dev-run)
- [Inspect gradient norms](https://williamfalcon.github.io/pytorch-lightning/Trainer/debugging/#inspect-gradient-norms)
- [Log GPU usage](https://williamfalcon.github.io/pytorch-lightning/Trainer/debugging/#Log-gpu-usage)
- [Make model overfit on subset of data](https://williamfalcon.github.io/pytorch-lightning/Trainer/debugging/#make-model-overfit-on-subset-of-data)
- [Print the parameter count by layer](https://williamfalcon.github.io/pytorch-lightning/Trainer/debugging/#print-the-parameter-count-by-layer)
- [Print which gradients are nan](https://williamfalcon.github.io/pytorch-lightning/Trainer/debugging/#print-which-gradients-are-nan)
- [Print input and output size of every module in system](https://williamfalcon.github.io/pytorch-lightning/LightningModule/properties/#example_input_array)

## Lightning automates all of the following ([each is also configurable](https://pytorch-lightning.rtfd.io/en/latest/pytorch_lightning.trainer.html)):

#### Distributed training

- [Implement Your Own Distributed (DDP) training](https://williamfalcon.github.io/pytorch-lightning/Trainer/hooks/#init_ddp_connection)
- [16-bit mixed precision](https://williamfalcon.github.io/pytorch-lightning/Trainer/Distributed%20training/#16-bit-mixed-precision)
- [Multi-GPU](https://williamfalcon.github.io/pytorch-lightning/Trainer/Distributed%20training/#Multi-GPU)
- [Multi-node](https://williamfalcon.github.io/pytorch-lightning/Trainer/Distributed%20training/#Multi-node)
- [Single GPU](https://williamfalcon.github.io/pytorch-lightning/Trainer/Distributed%20training/#single-gpu)
- [Self-balancing architecture](https://williamfalcon.github.io/pytorch-lightning/Trainer/Distributed%20training/#self-balancing-architecture)

#### Experiment Logging

- [Display metrics in progress bar](https://williamfalcon.github.io/pytorch-lightning/Trainer/Logging/#display-metrics-in-progress-bar)
- [Log metric row every k batches](https://williamfalcon.github.io/pytorch-lightning/Trainer/Logging/#log-metric-row-every-k-batches)
- [Process position](https://williamfalcon.github.io/pytorch-lightning/Trainer/Logging/#process-position)
- [Tensorboard support](https://williamfalcon.github.io/pytorch-lightning/Trainer/Logging/#tensorboard-support)
- [Save a snapshot of all hyperparameters](https://williamfalcon.github.io/pytorch-lightning/Trainer/Logging/#save-a-snapshot-of-all-hyperparameters)
- [Snapshot code for a training run](https://williamfalcon.github.io/pytorch-lightning/Trainer/Logging/#snapshot-code-for-a-training-run)
- [Write logs file to csv every k batches](https://williamfalcon.github.io/pytorch-lightning/Trainer/Logging/#write-logs-file-to-csv-every-k-batches)
- [Logging on W&B](https://williamfalcon.github.io/pytorch-lightning/Trainer/Logging/#w&b)
- [Logging experiment data to Neptune](https://williamfalcon.github.io/pytorch-lightning/Trainer/Logging/#neptune-support)

#### Training loop

- [Accumulate gradients](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#accumulated-gradients)
- [Force training for min or max epochs](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#force-training-for-min-or-max-epochs)
- [Early stopping callback](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#early-stopping)
- [Force disable early stop](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#force-disable-early-stop)
- [Gradient Clipping](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#gradient-clipping)
- [Hooks](https://williamfalcon.github.io/pytorch-lightning/Trainer/hooks/)
- [Learning rate scheduling](https://williamfalcon.github.io/pytorch-lightning/LightningModule/RequiredTrainerInterface/#configure_optimizers)
- [Use multiple optimizers (like GANs)](https://williamfalcon.github.io/pytorch-lightning/LightningModule/RequiredTrainerInterface/#configure_optimizers)
- [Set how much of the training set to check (1-100%)](https://williamfalcon.github.io/pytorch-lightning/Trainer/Training%20Loop/#set-how-much-of-the-training-set-to-check)
- [Step optimizers at arbitrary intervals](https://williamfalcon.github.io/pytorch-lightning/Trainer/hooks/#optimizer_step)
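Several of these Trainer features are simple loop transformations. Gradient accumulation, for instance, just sums per-batch gradients and applies one optimizer step every k batches. A minimal dependency-free sketch (plain Python; the function name and scalar "gradients" are illustrative, not the Lightning API):

```python
def train_with_accumulation(grads, k, lr=0.1):
    """Step the parameter once per k batches using the summed gradient.

    `grads` is a list of per-batch gradients (plain floats for simplicity);
    returns the parameter value after each optimizer step.
    """
    w = 0.0
    accumulated = 0.0
    steps = []
    for i, g in enumerate(grads, start=1):
        accumulated += g            # sum gradients instead of stepping now
        if i % k == 0:              # step only on every k-th batch
            w -= lr * accumulated   # one update with the combined gradient
            accumulated = 0.0
            steps.append(w)
    return steps

# Four batches, accumulate over k=2 -> exactly two optimizer steps.
print(train_with_accumulation([1.0, 1.0, 2.0, 2.0], k=2))
```

The effect is the same arithmetic as a larger batch size, without the memory cost of holding the larger batch.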

#### Validation loop

- [Check validation every n epochs](https://williamfalcon.github.io/pytorch-lightning/Trainer/Validation%20loop/#check-validation-every-n-epochs)
- [Hooks](https://williamfalcon.github.io/pytorch-lightning/Trainer/hooks/)
- [Set how much of the validation set to check](https://williamfalcon.github.io/pytorch-lightning/Trainer/Validation%20loop/#set-how-much-of-the-validation-set-to-check)
- [Set how much of the test set to check](https://williamfalcon.github.io/pytorch-lightning/Trainer/Validation%20loop/#set-how-much-of-the-test-set-to-check)
- [Set validation check frequency within 1 training epoch](https://williamfalcon.github.io/pytorch-lightning/Trainer/Validation%20loop/#set-validation-check-frequency-within-1-training-epoch)
- [Set the number of validation sanity steps](https://williamfalcon.github.io/pytorch-lightning/Trainer/Validation%20loop/#set-the-number-of-validation-sanity-steps)

#### Testing loop

- [Run test set](https://williamfalcon.github.io/pytorch-lightning/Trainer/Testing%20loop/)

- [Running grid search on a cluster](https://pytorch-lightning.rtfd.io/en/latest/pytorch_lightning.trainer.distrib_data_parallel.html)
- [Fast dev run](https://pytorch-lightning.rtfd.io/en/latest/pytorch_lightning.utilities.debugging.html)
- [Logging](https://pytorch-lightning.rtfd.io/en/latest/pytorch_lightning.logging.html)
- [Implement Your Own Distributed (DDP) training](https://pytorch-lightning.rtfd.io/en/latest/pytorch_lightning.core.lightning.html#pytorch_lightning.core.lightning.LightningModule.configure_ddp)
- [Multi-GPU & Multi-node](https://pytorch-lightning.rtfd.io/en/latest/pytorch_lightning.trainer.distrib_parts.html)
- [Training loop](https://pytorch-lightning.rtfd.io/en/latest/pytorch_lightning.trainer.training_loop.html)
- [Hooks](https://pytorch-lightning.rtfd.io/en/latest/pytorch_lightning.core.hooks.html)
- [Configure optimizers](https://pytorch-lightning.rtfd.io/en/latest/pytorch_lightning.core.lightning.html#pytorch_lightning.core.lightning.LightningModule.configure_optimizers)
- [Validations](https://pytorch-lightning.rtfd.io/en/latest/pytorch_lightning.trainer.evaluation_loop.html)
- [Model saving & Restoring training session](https://pytorch-lightning.rtfd.io/en/latest/pytorch_lightning.trainer.training_io.html)

## Examples

- [GAN](https://github.com/PytorchLightning/pytorch-lightning/tree/master/pl_examples/domain_templates/gan.py)
@@ -366,16 +309,18 @@ Lightning also adds a text column with all the hyperparameters for this experime
Welcome to the Lightning community!

If you have any questions, feel free to:
1. [read the docs](https://williamfalcon.github.io/pytorch-lightning/).
1. [read the docs](https://pytorch-lightning.rtfd.io/en/latest/).
2. [Search through the issues](https://github.com/PytorchLightning/pytorch-lightning/issues?utf8=%E2%9C%93&q=my++question).
3. [Ask on stackoverflow](https://stackoverflow.com/questions/ask?guided=false) with the tag pytorch-lightning.

If no one replies to you quickly enough, feel free to post the stackoverflow link to our [Slack channel](https://join.slack.com/t/pytorch-lightning/shared_invite/enQtODU5ODIyNTUzODQwLTFkMDg5Mzc1MDBmNjEzMDgxOTVmYTdhYjA1MDdmODUyOTg2OGQ1ZWZkYTQzODhhNzdhZDA3YmNhMDhlMDY4YzQ)!
If no one replies to you quickly enough, feel free to post the stackoverflow link to our Gitter chat!

To chat with the rest of us visit our [gitter channel](https://gitter.im/PyTorch-Lightning/community)!

---
## FAQ
**How do I use Lightning for rapid research?**
[Here's a walk-through](https://williamfalcon.github.io/pytorch-lightning/)
[Here's a walk-through](https://pytorch-lightning.rtfd.io/en/latest/)

**Why was Lightning created?**
Lightning has 3 goals in mind:
@@ -425,16 +370,16 @@ If you can't wait for the next release, install the most up to date code with:

### Any release installation

You can also install any past release from this repository:
You can also install any past release `0.X.Y` from this repository:
```bash
pip install https://github.com/PytorchLightning/pytorch-lightning/archive/0.4.4.zip --upgrade
pip install https://github.com/PytorchLightning/pytorch-lightning/archive/0.X.Y.zip --upgrade
```

## Bibtex
If you want to cite the framework feel free to use this (but only if you loved it 😊):
```
@misc{Falcon2019,
  author = {Falcon, W.A.},
  author = {Falcon, W.A. et al.},
  title = {PyTorch Lightning},
  year = {2019},
  publisher = {GitHub},
@@ -0,0 +1,62 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<svg
   xmlns:dc="http://purl.org/dc/elements/1.1/"
   xmlns:cc="http://creativecommons.org/ns#"
   xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
   xmlns:svg="http://www.w3.org/2000/svg"
   xmlns="http://www.w3.org/2000/svg"
   xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
   xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
   id="svg"
   version="1.1"
   width="16.000004"
   height="15.999986"
   viewBox="0 0 16.000004 15.999986"
   sodipodi:docname="lightning_icon.svg"
   inkscape:version="0.92.3 (2405546, 2018-03-11)">
  <metadata
     id="metadata13">
    <rdf:RDF>
      <cc:Work
         rdf:about="">
        <dc:format>image/svg+xml</dc:format>
        <dc:type
           rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
        <dc:title></dc:title>
      </cc:Work>
    </rdf:RDF>
  </metadata>
  <defs
     id="defs11" />
  <sodipodi:namedview
     pagecolor="#ffffff"
     bordercolor="#666666"
     borderopacity="1"
     objecttolerance="10"
     gridtolerance="10"
     guidetolerance="10"
     inkscape:pageopacity="0"
     inkscape:pageshadow="2"
     inkscape:window-width="1920"
     inkscape:window-height="1028"
     id="namedview9"
     showgrid="false"
     inkscape:zoom="0.59"
     inkscape:cx="-669.05062"
     inkscape:cy="373.84245"
     inkscape:window-x="0"
     inkscape:window-y="0"
     inkscape:window-maximized="1"
     inkscape:current-layer="svg" />
  <path
     style="fill:#fbfbfb;fill-rule:evenodd;stroke:none;stroke-width:0.04002798"
     inkscape:connector-curvature="0"
d="m 8.987101,1.723485 c -0.05588,0.03422 -4.121881,4.096544 -4.184645,4.180924 -0.02317,0.0311 -0.04587,0.06016 -0.05044,0.06456 -0.0087,0.0084 -0.07477,0.145063 -0.09679,0.20014 -0.05848,0.146583 -0.05804,0.44387 0.001,0.592413 0.08426,0.21243 0.08826,0.216754 1.576864,1.706274 0.779463,0.779947 1.41719,1.426877 1.41719,1.437604 0,0.0232 -0.253177,0.79848 -0.273873,0.838707 -0.0079,0.0153 -0.01433,0.04087 -0.01433,0.05684 0,0.01597 -0.0059,0.03587 -0.01313,0.04423 -0.0072,0.0084 -0.03678,0.09086 -0.06568,0.18333 -0.02893,0.09246 -0.05904,0.180647 -0.06693,0.195937 -0.0079,0.0153 -0.01437,0.04087 -0.01437,0.05684 0,0.01597 -0.0059,0.03586 -0.01313,0.04423 -0.0072,0.0084 -0.03679,0.09086 -0.06569,0.18333 -0.02893,0.09246 -0.05904,0.180643 -0.06693,0.195937 -0.0079,0.0153 -0.01437,0.04187 -0.01437,0.05908 0,0.0172 -0.0072,0.03574 -0.016,0.04119 -0.0088,0.0054 -0.016,0.02607 -0.016,0.04579 0,0.01973 -0.006,0.04271 -0.0134,0.05108 -0.0074,0.0084 -0.04439,0.112477 -0.08222,0.23136 -0.03787,0.118884 -0.151103,0.461124 -0.251693,0.760534 -0.489984,1.45874 -0.462444,1.36155 -0.413611,1.45938 0.06917,0.138657 0.23128,0.199741 0.358251,0.134974 0.07057,-0.03602 4.143298,-4.099985 4.245368,-4.236242 0.03382,-0.04515 0.09094,-0.165796 0.109916,-0.232123 0.0088,-0.03083 0.0243,-0.08498 0.03442,-0.120363 0.03346,-0.11668 0.0068,-0.361134 -0.0566,-0.520084 C 10.880518,9.229614 10.738898,9.079187 9.372744,7.714673 8.601524,6.944416 7.970523,6.302806 7.970523,6.288916 c 0,-0.01393 0.02817,-0.107833 0.0626,-0.208663 0.03442,-0.100834 0.07881,-0.237367 0.09859,-0.303414 0.0198,-0.06605 0.04207,-0.12693 0.04947,-0.135293 0.0074,-0.0084 0.0135,-0.03133 0.0135,-0.05108 0,-0.01973 0.0072,-0.04035 0.016,-0.04579 0.0088,-0.0054 0.016,-0.02707 0.016,-0.04803 0,-0.02097 0.0072,-0.04259 0.016,-0.04803 0.0088,-0.0054 0.016,-0.02707 0.016,-0.04803 0,-0.02097 0.0072,-0.04259 0.016,-0.04803 0.0088,-0.0054 0.016,-0.02707 0.016,-0.04803 0,-0.02097 0.0072,-0.04259 0.016,-0.04803 0.0088,-0.0054 
0.016,-0.02707 0.016,-0.04803 0,-0.02097 0.0072,-0.04259 0.016,-0.04803 0.0088,-0.0054 0.016,-0.02707 0.016,-0.04803 0,-0.02097 0.0072,-0.04259 0.016,-0.04803 0.0088,-0.0054 0.016,-0.02707 0.016,-0.04803 0,-0.02097 0.0072,-0.04259 0.016,-0.04804 0.0088,-0.0054 0.016,-0.02707 0.016,-0.04803 0,-0.02097 0.0072,-0.04259 0.016,-0.04803 0.0088,-0.0054 0.016,-0.02707 0.016,-0.04803 0,-0.02097 0.0072,-0.04259 0.016,-0.04803 0.0088,-0.0054 0.016,-0.02707 0.016,-0.04803 0,-0.02097 0.0072,-0.04259 0.016,-0.04803 0.0088,-0.0054 0.016,-0.02707 0.016,-0.04803 0,-0.02097 0.0072,-0.04259 0.016,-0.04803 0.0088,-0.0054 0.016,-0.02707 0.016,-0.04803 0,-0.02097 0.0072,-0.04259 0.016,-0.04803 0.0088,-0.0054 0.016,-0.02707 0.016,-0.04803 0,-0.02097 0.0072,-0.04259 0.016,-0.04803 0.0088,-0.0054 0.016,-0.02707 0.016,-0.04803 0,-0.02097 0.0072,-0.04259 0.016,-0.04804 0.0088,-0.0054 0.016,-0.02707 0.016,-0.04803 0,-0.02097 0.0072,-0.04259 0.016,-0.04803 0.0088,-0.0054 0.016,-0.02397 0.016,-0.04119 0,-0.0172 0.0065,-0.04379 0.0144,-0.05908 0.0079,-0.0153 0.119204,-0.34484 0.247334,-0.73231 C 9.064507,2.979766 9.220177,2.513319 9.28226,2.330632 9.408267,1.960092 9.41367,1.921146 9.35255,1.826839 9.27225,1.703032 9.099973,1.654399 8.986893,1.723566"
     id="path0" />
  <path
     style="fill:#540c8c;fill-rule:evenodd;stroke:none;stroke-width:0.04002798"
     inkscape:connector-curvature="0"
d="m 0.07719102,0.01733399 c -0.02187,0.0111 -0.04875,0.03799 -0.05984,0.05984 -0.0161,0.03173 -0.01937,1.62421701 -0.01633,7.94479601 l 0.0038,7.905086 0.03647,0.03646 0.03646,0.03647 H 8.00241 15.927073 l 0.03646,-0.03647 0.03647,-0.03646 V 8.002393 0.07773399 l -0.03647,-0.03646 -0.03646,-0.03647 -7.905086,-0.0038 c -6.320579,-0.003 -7.91305298,2.4e-4 -7.94479598,0.01633 M 9.193764,1.668208 c 0.259903,0.09046 0.275193,0.212427 0.09363,0.74628 C 8.845834,3.776859 8.388843,5.102846 7.991127,6.302606 L 9.415644,7.72492 c 1.24415,1.242111 1.51682,1.523547 1.51682,1.565414 0,0.0051 0.0133,0.03987 0.02953,0.07718 0.12913,0.296607 0.0877,0.664983 -0.103314,0.91872 -0.141456,0.187933 -4.207341,4.228478 -4.273468,4.246848 -0.139417,0.03871 -0.248653,-0.006 -0.34324,-0.140417 -0.07665,-0.108996 -0.06985,-0.137256 0.287004,-1.194633 0.34663,-1.101761 0.75901,-2.243218 1.08916,-3.290661 0,-0.0078 -0.636164,-0.650377 -1.413707,-1.427921 C 4.877658,7.152643 4.728155,6.995813 4.673718,6.87361 4.661948,6.84718 4.645988,6.81305 4.638168,6.79776 4.630368,6.78246 4.624038,6.75689 4.624038,6.74092 c 0,-0.01597 -0.0076,-0.03659 -0.01687,-0.04587 -0.02253,-0.02253 -0.02253,-0.436904 0,-0.45944 0.0093,-0.0093 0.01687,-0.0327 0.01687,-0.05204 0,-0.0363 0.06917,-0.178363 0.130414,-0.267907 0.07965,-0.1164 4.221831,-4.237681 4.259458,-4.237921 0.02047,-1.2e-4 0.04803,-0.0072 0.06124,-0.01577 0.03147,-0.02033 0.04415,-0.01967 0.118603,0.0062"
     id="path1"
     sodipodi:nodetypes="ccscccccccccccscccccscccccccccsssscccc" />
</svg>
After: 6.4 KiB
@@ -0,0 +1,61 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<svg
   xmlns:dc="http://purl.org/dc/elements/1.1/"
   xmlns:cc="http://creativecommons.org/ns#"
   xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
   xmlns:svg="http://www.w3.org/2000/svg"
   xmlns="http://www.w3.org/2000/svg"
   xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
   xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
   id="svg"
   version="1.1"
   width="400"
   height="400"
   viewBox="0, 0, 400,400"
   sodipodi:docname="lightning_logo.svg"
   inkscape:version="0.92.3 (2405546, 2018-03-11)">
  <metadata
     id="metadata13">
    <rdf:RDF>
      <cc:Work
         rdf:about="">
        <dc:format>image/svg+xml</dc:format>
        <dc:type
           rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
      </cc:Work>
    </rdf:RDF>
  </metadata>
  <defs
     id="defs11" />
  <sodipodi:namedview
     pagecolor="#ffffff"
     bordercolor="#666666"
     borderopacity="1"
     objecttolerance="10"
     gridtolerance="10"
     guidetolerance="10"
     inkscape:pageopacity="0"
     inkscape:pageshadow="2"
     inkscape:window-width="1920"
     inkscape:window-height="1028"
     id="namedview9"
     showgrid="false"
     inkscape:zoom="9.44"
     inkscape:cx="203.07907"
     inkscape:cy="335.32491"
     inkscape:window-x="0"
     inkscape:window-y="0"
     inkscape:window-maximized="1"
     inkscape:current-layer="svg" />
  <path
     style="fill:#fbfbfb;fill-rule:evenodd;stroke:none"
     inkscape:connector-curvature="0"
d="m 224.6,43.137 c -1.396,0.855 -102.975,102.342 -104.543,104.45 -0.579,0.777 -1.146,1.503 -1.26,1.613 -0.218,0.21 -1.868,3.624 -2.418,5 -1.461,3.662 -1.45,11.089 0.022,14.8 2.105,5.307 2.205,5.415 39.394,42.627 19.473,19.485 35.405,35.647 35.405,35.915 0,0.58 -6.325,19.948 -6.842,20.953 -0.197,0.382 -0.358,1.021 -0.358,1.42 0,0.399 -0.147,0.896 -0.328,1.105 -0.18,0.209 -0.919,2.27 -1.641,4.58 -0.723,2.31 -1.475,4.513 -1.672,4.895 -0.198,0.382 -0.359,1.021 -0.359,1.42 0,0.399 -0.147,0.896 -0.328,1.105 -0.18,0.209 -0.919,2.27 -1.641,4.58 -0.723,2.31 -1.475,4.513 -1.672,4.895 -0.198,0.382 -0.359,1.046 -0.359,1.476 0,0.43 -0.18,0.893 -0.4,1.029 -0.22,0.136 -0.4,0.651 -0.4,1.144 0,0.493 -0.151,1.067 -0.335,1.276 -0.184,0.209 -1.109,2.81 -2.054,5.78 -0.946,2.97 -3.775,11.52 -6.288,19 -12.241,36.443 -11.553,34.015 -10.333,36.459 1.728,3.464 5.778,4.99 8.95,3.372 1.763,-0.9 103.51,-102.428 106.06,-105.832 0.845,-1.128 2.272,-4.142 2.746,-5.799 0.22,-0.77 0.607,-2.123 0.86,-3.007 0.836,-2.915 0.171,-9.022 -1.414,-12.993 -1.493,-3.741 -5.031,-7.499 -39.161,-41.588 C 214.964,173.569 199.2,157.54 199.2,157.193 c 0,-0.348 0.704,-2.694 1.564,-5.213 0.86,-2.519 1.969,-5.93 2.463,-7.58 0.495,-1.65 1.051,-3.171 1.236,-3.38 0.186,-0.209 0.337,-0.783 0.337,-1.276 0,-0.493 0.18,-1.008 0.4,-1.144 0.22,-0.136 0.4,-0.676 0.4,-1.2 0,-0.524 0.18,-1.064 0.4,-1.2 0.22,-0.136 0.4,-0.676 0.4,-1.2 0,-0.524 0.18,-1.064 0.4,-1.2 0.22,-0.136 0.4,-0.676 0.4,-1.2 0,-0.524 0.18,-1.064 0.4,-1.2 0.22,-0.136 0.4,-0.676 0.4,-1.2 0,-0.524 0.18,-1.064 0.4,-1.2 0.22,-0.136 0.4,-0.676 0.4,-1.2 0,-0.524 0.18,-1.064 0.4,-1.2 0.22,-0.136 0.4,-0.676 0.4,-1.2 0,-0.524 0.18,-1.064 0.4,-1.2 0.22,-0.136 0.4,-0.676 0.4,-1.2 0,-0.524 0.18,-1.064 0.4,-1.2 0.22,-0.136 0.4,-0.676 0.4,-1.2 0,-0.524 0.18,-1.064 0.4,-1.2 0.22,-0.136 0.4,-0.676 0.4,-1.2 0,-0.524 0.18,-1.064 0.4,-1.2 0.22,-0.136 0.4,-0.676 0.4,-1.2 0,-0.524 0.18,-1.064 0.4,-1.2 0.22,-0.136 0.4,-0.676 0.4,-1.2 0,-0.524 0.18,-1.064 0.4,-1.2 0.22,-0.136 
0.4,-0.676 0.4,-1.2 0,-0.524 0.18,-1.064 0.4,-1.2 0.22,-0.136 0.4,-0.676 0.4,-1.2 0,-0.524 0.18,-1.064 0.4,-1.2 0.22,-0.136 0.4,-0.676 0.4,-1.2 0,-0.524 0.18,-1.064 0.4,-1.2 0.22,-0.136 0.4,-0.599 0.4,-1.029 0,-0.43 0.162,-1.094 0.36,-1.476 0.197,-0.382 2.978,-8.615 6.179,-18.295 3.2,-9.68 7.089,-21.333 8.64,-25.897 3.148,-9.257 3.283,-10.23 1.756,-12.586 -2.006,-3.093 -6.31,-4.308 -9.135,-2.58"
     id="path0" />
  <path
     style="fill:#540c8c;fill-rule:evenodd;stroke:none"
     inkscape:connector-curvature="0"
d="M 2.008,0.513 C 1.462,0.79 0.79,1.462 0.513,2.008 0.111,2.801 0.029,42.585 0.105,200.489 L 0.2,397.978 1.111,398.889 2.022,399.8 H 200 397.978 l 0.911,-0.911 0.911,-0.911 V 200 2.022 L 398.889,1.111 397.978,0.2 200.489,0.105 C 42.585,0.029 2.801,0.111 2.008,0.513 m 227.755,41.243 c 6.493,2.26 6.875,5.307 2.339,18.644 -11.0313,34.035452 -22.44803,67.16196 -32.384,97.135 l 35.588,35.533 c 31.082,31.031 37.894,38.062 37.894,39.108 0,0.128 0.332,0.996 0.738,1.928 3.226,7.41 2.191,16.613 -2.581,22.952 -3.534,4.695 -105.11,105.638 -106.762,106.097 -3.483,0.967 -6.212,-0.15 -8.575,-3.508 -1.915,-2.723 -1.745,-3.429 7.17,-29.845 8.65971,-27.52475 18.96205,-56.04122 27.21,-82.209 0,-0.195 -15.893,-16.248 -35.318,-35.673 -33.146,-33.147 -36.881,-37.065 -38.241,-40.118 -0.294,-0.66 -0.693,-1.513 -0.888,-1.895 -0.194,-0.382 -0.353,-1.021 -0.353,-1.42 0,-0.399 -0.189,-0.914 -0.421,-1.146 -0.563,-0.563 -0.563,-10.915 0,-11.478 0.232,-0.232 0.421,-0.817 0.421,-1.3 0,-0.907 1.728,-4.456 3.258,-6.693 C 120.848,144.96 224.33,42 225.27,41.994 c 0.511,-0.003 1.2,-0.181 1.53,-0.394 0.786,-0.508 1.103,-0.491 2.963,0.156"
     id="path1"
     sodipodi:nodetypes="ccscccccccccccscccccscccccccccsssscccc" />
</svg>
After: 5.2 KiB
After: 15 KiB
Before: 11 KiB → After: 8.3 KiB
@@ -0,0 +1,62 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<svg
   xmlns:dc="http://purl.org/dc/elements/1.1/"
   xmlns:cc="http://creativecommons.org/ns#"
   xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
   xmlns:svg="http://www.w3.org/2000/svg"
   xmlns="http://www.w3.org/2000/svg"
   xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
   xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
   id="svg"
   version="1.1"
   width="47.999985"
   height="47.999943"
   viewBox="0 0 47.999985 47.999943"
   sodipodi:docname="lightning_logo.svg"
   inkscape:version="0.92.3 (2405546, 2018-03-11)">
  <metadata
     id="metadata13">
    <rdf:RDF>
      <cc:Work
         rdf:about="">
        <dc:format>image/svg+xml</dc:format>
        <dc:type
           rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
        <dc:title />
      </cc:Work>
    </rdf:RDF>
  </metadata>
  <defs
     id="defs11" />
  <sodipodi:namedview
     pagecolor="#ffffff"
     bordercolor="#666666"
     borderopacity="1"
     objecttolerance="10"
     gridtolerance="10"
     guidetolerance="10"
     inkscape:pageopacity="0"
     inkscape:pageshadow="2"
     inkscape:window-width="1920"
     inkscape:window-height="1028"
     id="namedview9"
     showgrid="false"
     inkscape:zoom="0.59"
     inkscape:cx="-347.96588"
     inkscape:cy="389.84243"
     inkscape:window-x="0"
     inkscape:window-y="0"
     inkscape:window-maximized="1"
     inkscape:current-layer="svg" />
  <path
     style="fill:#fbfbfb;fill-rule:evenodd;stroke:none;stroke-width:0.12008391"
     inkscape:connector-curvature="0"
d="m 26.961294,5.1704519 c -0.16764,0.10267 -12.36564,12.2896301 -12.55393,12.5427701 -0.0695,0.0933 -0.13762,0.18048 -0.15131,0.19369 -0.0262,0.0252 -0.22432,0.43519 -0.29036,0.60042 -0.17544,0.43975 -0.17412,1.33161 0.003,1.77724 0.25278,0.63729 0.26479,0.65026 4.73059,5.11882 2.33839,2.33984 4.25157,4.28063 4.25157,4.31281 0,0.0696 -0.75953,2.39544 -0.82162,2.51612 -0.0237,0.0459 -0.043,0.12261 -0.043,0.17052 0,0.0479 -0.0177,0.1076 -0.0394,0.13269 -0.0216,0.0251 -0.11035,0.27259 -0.19705,0.54999 -0.0868,0.27739 -0.17713,0.54194 -0.20078,0.58781 -0.0238,0.0459 -0.0431,0.1226 -0.0431,0.17052 0,0.0479 -0.0177,0.10759 -0.0394,0.13269 -0.0216,0.0251 -0.11036,0.27259 -0.19706,0.54999 -0.0868,0.27739 -0.17712,0.54193 -0.20078,0.58781 -0.0238,0.0459 -0.0431,0.1256 -0.0431,0.17724 0,0.0516 -0.0216,0.10723 -0.048,0.12357 -0.0264,0.0163 -0.048,0.0782 -0.048,0.13737 0,0.0592 -0.0181,0.12813 -0.0402,0.15323 -0.0221,0.0251 -0.13318,0.33743 -0.24666,0.69408 -0.1136,0.35665 -0.45331,1.38337 -0.75508,2.2816 -1.46995,4.37622 -1.38733,4.08465 -1.24083,4.37814 0.2075,0.41597 0.69384,0.59922 1.07475,0.40492 0.21171,-0.10807 12.42989,-12.29995 12.7361,-12.70872 0.10147,-0.13545 0.27283,-0.49739 0.32975,-0.69637 0.0264,-0.0925 0.0729,-0.25493 0.10327,-0.36109 0.10039,-0.35004 0.0205,-1.0834 -0.1698,-1.56025 -0.17928,-0.44923 -0.60414,-0.90051 -4.7026,-4.99405 -2.31366,-2.31077 -4.20666,-4.2356 -4.20666,-4.27727 0,-0.0418 0.0845,-0.3235 0.18781,-0.62599 0.10327,-0.3025 0.23644,-0.7121 0.29577,-0.91024 0.0594,-0.19814 0.1262,-0.38079 0.14842,-0.40588 0.0223,-0.0251 0.0405,-0.094 0.0405,-0.15323 0,-0.0592 0.0216,-0.12105 0.048,-0.13738 0.0264,-0.0163 0.048,-0.0812 0.048,-0.1441 0,-0.0629 0.0216,-0.12777 0.048,-0.1441 0.0264,-0.0163 0.048,-0.0812 0.048,-0.1441 0,-0.0629 0.0216,-0.12777 0.048,-0.1441 0.0264,-0.0163 0.048,-0.0812 0.048,-0.1441 0,-0.0629 0.0216,-0.12777 0.048,-0.1441 0.0264,-0.0163 0.048,-0.0812 0.048,-0.1441 0,-0.0629 0.0216,-0.12777 0.048,-0.1441 0.0264,-0.0163 
0.048,-0.0812 0.048,-0.1441 0,-0.0629 0.0216,-0.12777 0.048,-0.1441 0.0264,-0.0163 0.048,-0.0812 0.048,-0.1441 0,-0.0629 0.0216,-0.12777 0.048,-0.14411 0.0264,-0.0163 0.048,-0.0812 0.048,-0.1441 0,-0.0629 0.0216,-0.12777 0.048,-0.1441 0.0264,-0.0163 0.048,-0.0812 0.048,-0.1441 0,-0.0629 0.0216,-0.12777 0.048,-0.1441 0.0264,-0.0163 0.048,-0.0812 0.048,-0.1441 0,-0.0629 0.0216,-0.12777 0.048,-0.1441 0.0264,-0.0163 0.048,-0.0812 0.048,-0.1441 0,-0.0629 0.0216,-0.12777 0.048,-0.1441 0.0264,-0.0163 0.048,-0.0812 0.048,-0.1441 0,-0.0629 0.0216,-0.12777 0.048,-0.1441 0.0264,-0.0163 0.048,-0.0812 0.048,-0.1441 0,-0.0629 0.0216,-0.12777 0.048,-0.1441 0.0264,-0.0163 0.048,-0.0812 0.048,-0.1441 0,-0.0629 0.0216,-0.12777 0.048,-0.14411 0.0264,-0.0163 0.048,-0.0812 0.048,-0.1441 0,-0.0629 0.0216,-0.12777 0.048,-0.1441 0.0264,-0.0163 0.048,-0.0719 0.048,-0.12356 0,-0.0516 0.0195,-0.13137 0.0432,-0.17725 0.0237,-0.0459 0.35761,-1.03452 0.742,-2.19693 0.38427,-1.1624101 0.85128,-2.5617501 1.03753,-3.1098101 0.37802,-1.11162 0.39423,-1.22846 0.21087,-1.51138 -0.24089,-0.37142 -0.75773,-0.51732 -1.09697,-0.30982"
id="path0" />
<path
style="fill:#540c8c;fill-rule:evenodd;stroke:none;stroke-width:0.12008391"
inkscape:connector-curvature="0"
d="m 0.2315739,0.05200186 c -0.0656,0.0333 -0.14626,0.11396 -0.17952,0.17952 -0.0483,0.0952 -0.0581,4.87265004 -0.049,23.83438014 l 0.0114,23.71525 0.1094,0.10939 0.10939,0.1094 h 23.7739701 23.77398 l 0.10939,-0.1094 0.1094,-0.10939 V 24.007172 0.23320186 l -0.1094,-0.10939 -0.10939,-0.1094 -23.71525,-0.0114 c -18.9617301,-0.009 -23.7391501,7.2e-4 -23.8343801,0.049 M 27.581274,5.0046319 c 0.77971,0.27139 0.82558,0.63728 0.28088,2.23884 -1.32468,4.0871101 -2.69565,8.0650701 -3.8888,11.6643501 l 4.27355,4.26694 c 3.73245,3.72633 4.55046,4.57064 4.55046,4.69624 0,0.0154 0.0399,0.11961 0.0886,0.23153 0.38739,0.88982 0.2631,1.99495 -0.30994,2.75616 -0.42437,0.5638 -12.62202,12.68543 -12.8204,12.74054 -0.41825,0.11613 -0.74596,-0.018 -1.02972,-0.42125 -0.22996,-0.32699 -0.20954,-0.41177 0.86101,-3.5839 1.03989,-3.30528 2.27703,-6.72965 3.26748,-9.87198 0,-0.0234 -1.90849,-1.95113 -4.24112,-4.28376 -3.98031,-3.98042 -4.42882,-4.45091 -4.59213,-4.81752 -0.0353,-0.0793 -0.0832,-0.18169 -0.10664,-0.22756 -0.0233,-0.0459 -0.0424,-0.12261 -0.0424,-0.17052 0,-0.0479 -0.0227,-0.10976 -0.0506,-0.13762 -0.0676,-0.0676 -0.0676,-1.31071 0,-1.37832 0.0279,-0.0279 0.0506,-0.0981 0.0506,-0.15611 0,-0.10891 0.20751,-0.53509 0.39124,-0.80372 0.23896,-0.3492 12.66549,-12.7130401 12.77837,-12.7137601 0.0614,-3.6e-4 0.1441,-0.0217 0.18372,-0.0473 0.0944,-0.061 0.13246,-0.059 0.35581,0.0187"
id="path1"
sodipodi:nodetypes="ccscccccccccccscccccscccccccccsssscccc" />
</svg>
@@ -2,16 +2,16 @@
    'github': 'https://github.com/PytorchLightning/pytorch-lightning',
    'github_issues': 'https://github.com/PytorchLightning/pytorch-lightning/issues',
    'contributing': 'https://github.com/PytorchLightning/pytorch-lightning/blob/master/CONTRIBUTING.md',
    'docs': 'https://pytorchlightning.github.io/pytorch-lightning',
    'docs': 'https://pytorch-lightning.rtfd.io/en/latest',
    'twitter': 'https://twitter.com/PyTorchLightnin',
    'discuss': 'https://discuss.pytorch.org',
    'tutorials': 'https://pytorchlightning.github.io/pytorch-lightning/',
    'previous_pytorch_versions': 'https://pytorchlightning.github.io/pytorch-lightning/',
    'home': 'https://pytorchlightning.github.io/pytorch-lightning/',
    'get_started': 'https://pytorchlightning.github.io/pytorch-lightning/',
    'features': 'https://pytorchlightning.github.io/pytorch-lightning/',
    'blog': 'https://pytorchlightning.github.io/pytorch-lightning/',
    'resources': 'https://pytorchlightning.github.io/pytorch-lightning/',
    'support': 'https://pytorchlightning.github.io/pytorch-lightning/',
    'tutorials': 'https://pytorch-lightning.rtfd.io/en/latest/',
    'previous_pytorch_versions': 'https://pytorch-lightning.rtfd.io/en/latest/',
    'home': 'https://pytorch-lightning.rtfd.io/en/latest/',
    'get_started': 'https://pytorch-lightning.rtfd.io/en/latest/',
    'features': 'https://pytorch-lightning.rtfd.io/en/latest/',
    'blog': 'https://pytorch-lightning.rtfd.io/en/latest/',
    'resources': 'https://pytorch-lightning.rtfd.io/en/latest/',
    'support': 'https://pytorch-lightning.rtfd.io/en/latest/',
}
-%}

@@ -150,7 +150,7 @@ html_theme_options = {
    'logo_only': False,
}

html_logo = '_static/images/lightning_logo_small.png'
html_logo = '_static/images/lightning_logo-name.svg'

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,

@@ -303,7 +303,7 @@ autodoc_mock_imports = MOCK_REQUIRE_PACKAGES + MOCK_MANUAL_PACKAGES

# Options for the linkcode extension
# ----------------------------------
github_user = 'williamFalcon'
github_user = 'PyTorchLightning'
github_repo = project

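For context on the `github_user`/`github_repo` change above: Sphinx's `linkcode` extension calls a `linkcode_resolve(domain, info)` hook to turn documented objects into source links, built from exactly these values. A minimal sketch of such a resolver (the hook body and URL layout here are an illustration, not the project's actual `conf.py`):

```python
github_user = 'PyTorchLightning'
github_repo = 'pytorch-lightning'

def linkcode_resolve(domain, info):
    """Map a documented Python object to a GitHub source URL (None tells Sphinx to skip it)."""
    if domain != 'py' or not info.get('module'):
        return None
    path = info['module'].replace('.', '/')
    return 'https://github.com/%s/%s/blob/master/%s.py' % (github_user, github_repo, path)
```

With the org rename, every `[source]` link in the rendered docs then points at `PyTorchLightning/pytorch-lightning` instead of the old personal fork.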
@@ -3,13 +3,13 @@ Template model definition
-------------------------

In 99% of cases you want to just copy `one of the examples
<https://github.com/williamFalcon/pytorch-lightning/tree/master/pl_examples>`_
<https://github.com/PyTorchLightning/pytorch-lightning/tree/master/pl_examples>`_
to start a new lightningModule and change the core of what your model is actually trying to do.

.. code-block:: bash

    # get a copy of the module template
    wget https://raw.githubusercontent.com/williamFalcon/pytorch-lightning/master/pl_examples/new_project_templates/lightning_module_template.py # noqa: E501
    wget https://raw.githubusercontent.com/PyTorchLightning/pytorch-lightning/master/pl_examples/new_project_templates/lightning_module_template.py # noqa: E501


Trainer Example

@@ -1,8 +1,8 @@
"""
Example template for defining a system
"""
import os
import logging
import os
from argparse import ArgumentParser
from collections import OrderedDict

@@ -8,20 +8,18 @@ from collections import OrderedDict

import torch
import torch.backends.cudnn as cudnn
import torch.nn.parallel
import torch.nn.functional as F
import torch.nn.parallel
import torch.optim as optim
import torch.optim.lr_scheduler as lr_scheduler
import torch.utils.data
import torch.utils.data.distributed

import torchvision.transforms as transforms
import torchvision.models as models
import torchvision.datasets as datasets
import torchvision.models as models
import torchvision.transforms as transforms

import pytorch_lightning as pl


# pull out resnet names from torchvision models
MODEL_NAMES = sorted(
    name for name in models.__dict__
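The import hunks above are pure reordering: within each group (stdlib, third-party, first-party) the lines become alphabetical, the ordering that tools such as `isort` enforce. A tiny illustration with the reordered torchvision block:

```python
# Removed order was transforms, models, datasets; the added order is alphabetical.
before = [
    "import torchvision.transforms as transforms",
    "import torchvision.models as models",
    "import torchvision.datasets as datasets",
]
after = sorted(before)
print(after)  # datasets, then models, then transforms
```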
|
@@ -5,7 +5,7 @@ __author__ = 'William Falcon et al.'
__author_email__ = 'waf2107@columbia.edu'
__license__ = 'Apache-2.0'
__copyright__ = 'Copyright (c) 2018-2019, %s.' % __author__
__homepage__ = 'https://github.com/williamFalcon/pytorch-lightning'
__homepage__ = 'https://github.com/PyTorchLightning/pytorch-lightning'
# this has to be simple string, see: https://github.com/pypa/twine/issues/522
__docs__ = "PyTorch Lightning is the lightweight PyTorch wrapper for ML researchers." \
           " Scale your models. Write less boilerplate."

@@ -21,7 +21,7 @@ except NameError:

if __LIGHTNING_SETUP__:
    import sys
    sys.stderr.write('Partial import of skimage during the build process.\n')
    sys.stderr.write('Partial import of torchlightning during the build process.\n')
    # We are not importing the rest of the scikit during the build
    # process, as it may not be compiled yet
else:

@@ -1,13 +1,15 @@
"""
Callbacks
====================================
=========

Callbacks supported by Lightning
"""

import logging
import os
import shutil
import logging
import warnings

import numpy as np

from pytorch_lightning.overrides.data_parallel import LightningDistributedDataParallel

@@ -163,9 +165,7 @@ class EarlyStopping(Callback):


class ModelCheckpoint(Callback):
    r"""

    Save the model after every epoch.
    r"""Save the model after every epoch.

    Args:
        filepath (str): path to save the model file.

@@ -1,21 +1,19 @@


import os
import warnings
import collections
import logging
import pandas as pd
import os
import warnings
from abc import ABC, abstractmethod
from argparse import Namespace

import pandas as pd
import torch
import torch.distributed as dist
#

from pytorch_lightning.core.decorators import data_loader
from pytorch_lightning.core.grads import GradInformation
from pytorch_lightning.core.hooks import ModelHooks
from pytorch_lightning.core.saving import ModelIO
from pytorch_lightning.core.memory import ModelSummary
from pytorch_lightning.core.saving import ModelIO
from pytorch_lightning.overrides.data_parallel import LightningDistributedDataParallel

@@ -3,13 +3,13 @@ Generates a summary of a model's layers and dimensionality
'''

import gc
import logging
import os
import subprocess

import numpy as np
import pandas as pd
import torch
import logging


class ModelSummary(object):

@@ -6,5 +6,3 @@ import warnings

warnings.warn("`root_module` module has been renamed to `lightning` since v0.6.0"
              " and will be removed in v0.8.0", DeprecationWarning)

from pytorch_lightning.core.lightning import LightningModule  # noqa: E402

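The `root_module` shim above follows the standard rename-deprecation pattern: emit a `DeprecationWarning` once at import time, then re-export everything from the new location so old imports keep working. A self-contained demonstration of the warning half (no Lightning install assumed):

```python
import warnings

# Capture the warning the shim module would raise when imported.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    warnings.warn("`root_module` module has been renamed to `lightning` since v0.6.0"
                  " and will be removed in v0.8.0", DeprecationWarning)

assert issubclass(caught[0].category, DeprecationWarning)
```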
|
@@ -9,6 +9,7 @@ from torch.utils.data import DataLoader
from torch.utils.data.distributed import DistributedSampler
from torchvision import transforms
from torchvision.datasets import MNIST

try:
    from test_tube import HyperOptArgumentParser
except ImportError:

@@ -2,6 +2,7 @@ import warnings
from abc import ABC

import torch.distributed as dist

try:
    # loading for pyTorch 1.3
    from torch.utils.data import IterableDataset

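The `try`/`except ImportError` above is the usual guard for APIs that only exist from a given PyTorch release onward (`IterableDataset` here). The same pattern, sketched against a deliberately missing, hypothetical package:

```python
try:
    from some_optional_dep import FancyDataset  # hypothetical module, not installed
    HAS_FANCY = True
except ImportError:
    FancyDataset = None  # sentinel so later feature checks can degrade gracefully
    HAS_FANCY = False

print(HAS_FANCY)
```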
|
@@ -113,9 +113,9 @@ When the script starts again, Lightning will:

"""

import logging
import os
import re
import logging
import warnings
from abc import ABC, abstractmethod

@@ -276,7 +276,7 @@ Instead of manually building SLURM scripts, you can use the

Here is an example where you run a grid search of 9 combinations of hyperparams.
The full examples are `here
<https://github.com/williamFalcon/pytorch-lightning/tree/master/pl_examples/new_project_templates/multi_node_examples>`_.
<https://github.com/PyTorchLightning/pytorch-lightning/tree/master/pl_examples/new_project_templates/multi_node_examples>`_.

.. code-block:: python

@@ -123,10 +123,10 @@ In this second case, the options you pass to trainer will be used when running

"""

import sys
from abc import ABC, abstractmethod

import torch
import sys
import tqdm

from pytorch_lightning.utilities.debugging import MisconfigurationException

@@ -1,9 +1,9 @@


import logging
import os
import sys
import warnings
import logging

import torch
import torch.distributed as dist

@@ -23,8 +23,8 @@ from pytorch_lightning.trainer.distrib_parts import (
from pytorch_lightning.trainer.evaluation_loop import TrainerEvaluationLoopMixin
from pytorch_lightning.trainer.logging import TrainerLoggingMixin
from pytorch_lightning.trainer.model_hooks import TrainerModelHooksMixin
from pytorch_lightning.trainer.training_loop import TrainerTrainLoopMixin
from pytorch_lightning.trainer.training_io import TrainerIOMixin
from pytorch_lightning.trainer.training_loop import TrainerTrainLoopMixin
from pytorch_lightning.trainer.training_tricks import TrainerTrainingTricksMixin
from pytorch_lightning.utilities.debugging import MisconfigurationException

@@ -89,14 +89,16 @@ At a rough level, here's what happens inside Trainer :py:mod:`pytorch_lightning.

"""

import logging
import os
import re
import signal
import warnings
from subprocess import call
import logging
from abc import ABC
from subprocess import call
from argparse import Namespace

import pandas as pd
import torch
import torch.distributed as dist

@@ -268,6 +270,7 @@ class TrainerIOMixin(ABC):
        torch.save(checkpoint, filepath)

    def restore(self, checkpoint_path, on_gpu):

        # if on_gpu:
        #     checkpoint = torch.load(checkpoint_path)
        # else:

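The save/restore pair above round-trips a plain checkpoint dict through `torch.save`/`torch.load`. The shape of that round trip, sketched with stdlib `pickle` as a stand-in and hypothetical checkpoint keys:

```python
import io
import pickle

checkpoint = {"epoch": 5, "global_step": 1200, "state_dict": {"w": [0.1, 0.2]}}  # hypothetical contents

buf = io.BytesIO()
pickle.dump(checkpoint, buf)   # plays the role of torch.save(checkpoint, filepath)
buf.seek(0)
restored = pickle.load(buf)    # plays the role of torch.load(checkpoint_path)

assert restored == checkpoint
```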
|
@@ -151,10 +151,10 @@ When this flag is enabled each batch is split into sequences of size truncated_b


"""

import copy
import inspect
from abc import ABC, abstractmethod
import warnings
from abc import ABC, abstractmethod

import numpy as np

@@ -1,7 +1,8 @@
import logging
from abc import ABC, abstractmethod

import torch
import logging

from pytorch_lightning.callbacks import GradientAccumulationScheduler

setup.py

@@ -44,7 +44,7 @@ setup(
    author=pytorch_lightning.__author__,
    author_email=pytorch_lightning.__author_email__,
    url=pytorch_lightning.__homepage__,
    download_url='https://github.com/williamFalcon/pytorch-lightning',
    download_url='https://github.com/PyTorchLightning/pytorch-lightning',
    license=pytorch_lightning.__license__,
    packages=find_packages(exclude=['tests']),

@@ -10,7 +10,7 @@ run on a 2-GPU machine to validate the full test-suite.

To run all tests do the following:
```bash
git clone https://github.com/williamFalcon/pytorch-lightning
git clone https://github.com/PyTorchLightning/pytorch-lightning
cd pytorch-lightning

# install module locally