# Training examples

## Installing the dependencies

Before running the scripts, make sure to install the library's training dependencies:

```bash
pip install diffusers[training] accelerate datasets
```
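
Since the examples below are launched with `accelerate launch`, you can also run `accelerate config` once beforehand to set up your hardware configuration (number of GPUs, mixed precision, etc.); without it, Accelerate falls back to a default single-process setup.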

## Unconditional Flowers

The command to train a DDPM UNet model on the Oxford Flowers dataset:

```bash
accelerate launch train_unconditional.py \
  --dataset="huggan/flowers-102-categories" \
  --resolution=64 \
  --output_dir="ddpm-ema-flowers-64" \
  --train_batch_size=16 \
  --num_epochs=100 \
  --gradient_accumulation_steps=1 \
  --learning_rate=1e-4 \
  --lr_warmup_steps=500 \
  --mixed_precision=no \
  --push_to_hub
```

A full training run takes 2 hours on 4xV100 GPUs.
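
Once training finishes, the saved pipeline can be loaded back for sampling. The snippet below is a minimal sketch: it assumes the pipeline was written to the `output_dir` above (or pushed to the Hub under the same name via `--push_to_hub`), and the exact output format of the pipeline call may differ between diffusers versions.

```python
from diffusers import DDPMPipeline

# Load the trained pipeline from the local output directory
# (or from the Hub, e.g. "your-username/ddpm-ema-flowers-64").
pipeline = DDPMPipeline.from_pretrained("ddpm-ema-flowers-64")

# Sample a single 64x64 image; the output attribute may vary across diffusers versions.
image = pipeline(batch_size=1).images[0]
image.save("flowers_sample.png")
```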

## Unconditional Pokemon

The command to train a DDPM UNet model on the Pokemon dataset:

```bash
accelerate launch train_unconditional.py \
  --dataset="huggan/pokemon" \
  --resolution=64 \
  --output_dir="ddpm-ema-pokemon-64" \
  --train_batch_size=16 \
  --num_epochs=100 \
  --gradient_accumulation_steps=1 \
  --learning_rate=1e-4 \
  --lr_warmup_steps=500 \
  --mixed_precision=no \
  --push_to_hub
```

A full training run takes 2 hours on 4xV100 GPUs.
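
Both commands pull their images from the Hugging Face Hub via the `datasets` library. If you want to inspect a dataset before committing to a long training run, the sketch below shows one way to do it; the `"train"` split name and the `"image"` column name are assumptions, so check the dataset card on the Hub for the actual schema.

```python
from datasets import load_dataset

# Download the dataset used above and inspect it before launching training.
dataset = load_dataset("huggan/pokemon", split="train")
print(dataset)        # number of examples and column names
dataset[0]["image"]   # typically a PIL image, resized to --resolution during training
```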