Training examples
Unconditional Flowers
To train a DDPM UNet model on the Oxford Flowers dataset, run:
python -m torch.distributed.launch \
--nproc_per_node 4 \
train_unconditional.py \
--dataset="huggan/flowers-102-categories" \
--resolution=64 \
--output_dir="flowers-ddpm" \
--batch_size=16 \
--num_epochs=100 \
--gradient_accumulation_steps=1 \
--lr=1e-4 \
--warmup_steps=500 \
--mixed_precision=no
A full training run takes 2 hours on 4xV100 GPUs.
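Once training finishes, the script saves a pipeline to the directory given by --output_dir. A minimal sketch for sampling from the result, assuming the script saves a DDPMPipeline and that "flowers-ddpm" matches the --output_dir used above:

from diffusers import DDPMPipeline

# Load the pipeline written by train_unconditional.py
# ("flowers-ddpm" assumes the --output_dir from the command above).
pipeline = DDPMPipeline.from_pretrained("flowers-ddpm")

# Generate one 64x64 sample; .images is a list of PIL images.
image = pipeline(batch_size=1).images[0]
image.save("flowers_sample.png")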
Unconditional Pokemon
To train a DDPM UNet model on the Pokemon dataset, run:
python -m torch.distributed.launch \
--nproc_per_node 4 \
train_unconditional.py \
--dataset="huggan/pokemon" \
--resolution=64 \
--output_dir="pokemon-ddpm" \
--batch_size=16 \
--num_epochs=100 \
--gradient_accumulation_steps=1 \
--lr=1e-4 \
--warmup_steps=500 \
--mixed_precision=no
A full training run takes 2 hours on 4xV100 GPUs.
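The --dataset argument accepts a dataset name hosted on the Hugging Face Hub. A quick sketch for previewing a few examples before launching a full run, assuming the datasets library is installed and that the dataset exposes an "image" column:

from datasets import load_dataset

# Stream a handful of examples without downloading the whole dataset.
ds = load_dataset("huggan/pokemon", split="train", streaming=True)

for i, example in enumerate(ds):
    # The "image" key is an assumption; adjust it if the dataset uses another column name.
    print(example["image"].size)
    if i >= 4:
        break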