EveryDream2trainer/data
Latest commit c0a1955164 by Victor Hall, 2023-11-05 19:53:11 -05:00: Merge pull request #233 from damian0815/feat_the_acculmunator (plugin: grad accumulation scheduler)
aspects.py    add a batch_id.txt file to subfolders, or a batch_id key to the local yaml, to force images for that folder to be processed in the same batch    2023-06-05 01:02:27 +02:00
data_loader.py    fix chunked_shuffle crash with empty list    2023-06-17 11:04:14 +02:00
dataset.py    put a file loss_scale.txt containing a float in a training folder to apply loss scale (e.g. -1 for negative examples)    2023-10-31 10:06:21 +01:00
ed_dl_wrap.py
every_dream.py    Merge pull request #233 from damian0815/feat_the_acculmunator    2023-11-05 19:53:11 -05:00
every_dream_validation.py    better check for null manual_data_root    2023-06-17 11:04:14 +02:00
image_train_item.py    fix bug when loss_scale.txt contains 0    2023-11-01 19:00:08 +01:00
latent_cache.py
resolver.py
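The aspects.py entry describes a per-folder batching control: dropping a batch_id.txt file into a dataset subfolder, or adding a batch_id key to that folder's local yaml, forces its images into the same batch. A minimal sketch of what that folder layout could look like (the folder name "portraits" and the batch id value are illustrative assumptions, not taken from the repo):

```python
import pathlib
import tempfile

# Hypothetical dataset layout; "portraits" is an illustrative folder name.
root = pathlib.Path(tempfile.mkdtemp())
folder = root / "portraits"
folder.mkdir()

# Option 1: a batch_id.txt file whose contents name the batch.
(folder / "batch_id.txt").write_text("portraits_batch")

# Option 2: a batch_id key in the folder's local yaml
# (filename shown here is an assumption).
(folder / "local.yaml").write_text("batch_id: portraits_batch\n")

print((folder / "batch_id.txt").read_text())
```

Either mechanism tags every image in the folder with the same batch id, so the loader can keep them together when it assembles batches.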
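The dataset.py entry describes another per-folder file: loss_scale.txt holds a single float that scales the training loss for that folder's images, with -1 turning them into negative examples. A sketch of writing and reading such a file, assuming the repo's actual parsing lives elsewhere (the read_loss_scale helper here is hypothetical, not the project's implementation):

```python
import pathlib
import tempfile

folder = pathlib.Path(tempfile.mkdtemp())

# A loss scale of -1 marks this folder's images as negative examples.
(folder / "loss_scale.txt").write_text("-1")

def read_loss_scale(path: pathlib.Path, default: float = 1.0) -> float:
    """Hypothetical helper: read the folder's loss scale, falling back to
    the default when the file is missing or unparseable."""
    try:
        return float((path / "loss_scale.txt").read_text().strip())
    except (FileNotFoundError, ValueError):
        return default

print(read_loss_scale(folder))  # -1.0
```

Note the image_train_item.py commit about a value of 0: a scale of exactly 0 is a legal float but silences the loss entirely, so it is easy to mishandle (e.g. by treating it as falsy) and needed a dedicated fix.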