Docker images

Build images from the attached Dockerfiles.

You can build the images on your own; note that this takes a lot of time, so be prepared:

git clone <git-repository>
docker image build -t pytorch-lightning:latest -f dockers/base-conda/Dockerfile .

or with specific build arguments:

git clone <git-repository>
docker image build \
    -t pytorch-lightning:py3.8-pt1.6 \
    -f dockers/base-cuda/Dockerfile \
    --build-arg PYTHON_VERSION=3.8 \
    --build-arg PYTORCH_VERSION=1.6 \
    .

To run your docker image, use:

docker image list
docker run --rm -it pytorch-lightning:latest bash
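
If you want to run your own scripts inside the container, you can mount your working directory as a volume. A minimal sketch, assuming python is on the image's PATH; my_script.py is just a placeholder for your own script:

# mount the current directory and run a (placeholder) script inside the container
docker run --rm -it \
    -v $(pwd):/workspace \
    -w /workspace \
    pytorch-lightning:latest \
    python my_script.py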

and if you do not need it anymore, just clean it up:

docker image list
docker image rm pytorch-lightning:latest
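
Building these images also tends to leave dangling intermediate layers behind; they can be removed with Docker's standard prune command (note that this removes all dangling images on the machine, not only the Lightning ones):

# remove dangling (untagged) image layers left over from builds
docker image prune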

Run the docker image with GPUs

To run the docker image with access to your GPUs, you need to install the NVIDIA Container Toolkit:

# Add the package repositories
distribution=$(. /etc/os-release;echo $ID$VERSION_ID)
curl -s -L https://nvidia.github.io/nvidia-docker/gpgkey | sudo apt-key add -
curl -s -L https://nvidia.github.io/nvidia-docker/$distribution/nvidia-docker.list | sudo tee /etc/apt/sources.list.d/nvidia-docker.list

sudo apt-get update && sudo apt-get install -y nvidia-container-toolkit
sudo systemctl restart docker

and then run the docker image with --gpus all, for example:

docker run --rm -it --gpus all pytorchlightning/pytorch_lightning:base-cuda-py3.7-torch1.6
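
To verify that the GPUs are actually visible from inside the container, a quick sanity check is to ask PyTorch for CUDA availability (assuming python and torch are available on the image's PATH):

# should print True if the container can see your GPUs
docker run --rm --gpus all pytorchlightning/pytorch_lightning:base-cuda-py3.7-torch1.6 \
    python -c "import torch; print(torch.cuda.is_available())"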