EveryDream2trainer/data
Latest commit 86aaf1c4d7 by Damian Stewart, 2023-11-01 19:00:08 +01:00: fix bug when loss_scale.txt contains 0
| File | Last commit | Date |
| --- | --- | --- |
| aspects.py | add a batch_id.txt file to subfolders or a batch_id key to local yaml to force images for that folder to be processed in the same batch (see the batch_id sketch below) | 2023-06-05 01:02:27 +02:00 |
| data_loader.py | fix chunked_shuffle crash with empty list | 2023-06-17 11:04:14 +02:00 |
| dataset.py | put a file loss_scale.txt containing a float in a training folder to apply loss scale (e.g. -1 for negative examples; see the loss_scale sketch below) | 2023-10-31 10:06:21 +01:00 |
| ed_dl_wrap.py | hey look ed2 | 2022-12-17 22:32:48 -05:00 |
| every_dream.py | put a file loss_scale.txt containing a float in a training folder to apply loss scale (e.g. -1 for negative examples) | 2023-10-31 10:06:21 +01:00 |
| every_dream_validation.py | better check for null manual_data_root | 2023-06-17 11:04:14 +02:00 |
| image_train_item.py | fix bug when loss_scale.txt contains 0 | 2023-11-01 19:00:08 +01:00 |
| latent_cache.py | hey look ed2 | 2022-12-17 22:32:48 -05:00 |
| resolver.py | Add support for enhanced dataset configuration | 2023-03-08 15:02:14 +01:00 |
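The batch_id mechanism noted in the aspects.py row groups all images in a folder into the same training batch. Below is a minimal sketch of how such a folder-level id might be resolved; the helper name `resolve_batch_id` and the local yaml file name `local.yaml` are assumptions for illustration, not the actual EveryDream2trainer API.

```python
# Minimal sketch (not the trainer's actual code): pick up a per-folder batch id
# either from a batch_id.txt file or from a "batch_id" key in a local yaml file.
import os
from typing import Optional

import yaml  # PyYAML


def resolve_batch_id(folder: str) -> Optional[str]:
    # Plain text file: the whole (stripped) content is the batch id.
    txt_path = os.path.join(folder, "batch_id.txt")
    if os.path.isfile(txt_path):
        with open(txt_path, encoding="utf-8") as f:
            return f.read().strip()

    # Local yaml: the file name "local.yaml" is an assumption for illustration.
    yaml_path = os.path.join(folder, "local.yaml")
    if os.path.isfile(yaml_path):
        with open(yaml_path, encoding="utf-8") as f:
            cfg = yaml.safe_load(f) or {}
        return cfg.get("batch_id")

    return None  # no batch id: the folder's images may mix freely across batches
```

Images that resolve to the same non-None id would then be kept together when the data loader assembles batches.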
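The loss_scale.txt mechanism (dataset.py, every_dream.py, and image_train_item.py rows) lets a training folder scale its contribution to the loss, e.g. -1 for negative examples. Below is a minimal sketch, assuming the value is read per folder and multiplied into the per-image loss; the helper names are illustrative, and the explicit None check reflects the kind of falsy-zero bug the latest commit message suggests was fixed.

```python
# Minimal sketch (not the actual image_train_item.py code): read an optional
# loss_scale.txt from a training folder and return the multiplier to apply to
# that folder's per-image loss.
import os
from typing import Optional


def read_loss_scale(folder: str) -> Optional[float]:
    """Return the float stored in <folder>/loss_scale.txt, or None if absent."""
    path = os.path.join(folder, "loss_scale.txt")
    if not os.path.isfile(path):
        return None
    with open(path, encoding="utf-8") as f:
        return float(f.read().strip())


def effective_loss_scale(folder: str) -> float:
    scale = read_loss_scale(folder)
    # 0 is a legitimate value (drop the folder's contribution entirely), so test
    # for None instead of truthiness; `scale or 1.0` would silently turn 0 into 1,
    # which is the kind of bug the "contains 0" fix above points at.
    return 1.0 if scale is None else scale


# Usage sketch: loss = per_image_loss * effective_loss_scale(image_folder)
# e.g. a folder of negative examples with loss_scale.txt containing -1.
```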