EveryDream2trainer/utils
Latest commit 0716c40ab6 by Augusto de la Torre (2023-03-08 15:02:14 +01:00): Add support for enhanced dataset configuration
Add support for:
* flip_p.txt
* cond_dropout.txt
* local.yaml consolidated config (including default captions)
* global.yaml consolidated config, which applies recursively to subfolders
* flip_p and cond_dropout config per image
* manifest.json with full image-level configuration
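The layered configuration described in this commit can be pictured with a short sketch. The snippet below is not the project's actual loader; it is a minimal illustration that assumes PyYAML is available and assumes a precedence of global.yaml (applied recursively, root first), then local.yaml, then per-folder flip_p.txt / cond_dropout.txt, then per-image entries in manifest.json. The helper names, the manifest schema (image file name mapped to a dict of overrides), and the example paths are hypothetical.

```python
# Minimal, hypothetical sketch of how the layered dataset config described in
# the commit message could be resolved for one image. This is NOT the
# project's actual loader; the key names follow the commit message, everything
# else (precedence, manifest schema, helper names) is assumed.
import json
from pathlib import Path

import yaml  # PyYAML, assumed to be installed


def read_yaml(path: Path) -> dict:
    """Return a YAML file's contents as a dict, or {} if it is missing/empty."""
    if path.is_file():
        with open(path, "r", encoding="utf-8") as f:
            return yaml.safe_load(f) or {}
    return {}


def resolve_image_config(image_path: Path, dataset_root: Path) -> dict:
    """Merge config layers from least to most specific for a single image."""
    cfg: dict = {}

    # global.yaml applies recursively to subfolders: walk from the dataset
    # root down to the image's folder so deeper files override shallower ones.
    chain = [dataset_root]
    for part in image_path.parent.relative_to(dataset_root).parts:
        chain.append(chain[-1] / part)
    for folder in chain:
        cfg.update(read_yaml(folder / "global.yaml"))

    # local.yaml is a consolidated config for the image's own folder
    # (default captions, flip_p, cond_dropout, ...).
    folder = image_path.parent
    cfg.update(read_yaml(folder / "local.yaml"))

    # flip_p.txt / cond_dropout.txt: single-number overrides for the folder.
    for key in ("flip_p", "cond_dropout"):
        txt = folder / f"{key}.txt"
        if txt.is_file():
            cfg[key] = float(txt.read_text().strip())

    # manifest.json with full image-level configuration; assumed here to map
    # image file names to dicts of per-image overrides (hypothetical schema).
    manifest = folder / "manifest.json"
    if manifest.is_file():
        per_image = json.loads(manifest.read_text()).get(image_path.name, {})
        cfg.update(per_image)

    return cfg


if __name__ == "__main__":
    root = Path("training_data")              # hypothetical dataset root
    image = root / "people" / "alice_01.jpg"  # hypothetical image
    print(resolve_image_config(image, root))
```

Ordering the layers from most general to most specific keeps the merge a plain sequence of dict.update calls, with later, more specific layers winning.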
analyze_unet.py: assume epsilon for compatibility with old diffusers converted files (2023-02-08 15:42:07 -05:00)
convert_diff_to_ckpt.py: issue with conv (2023-01-12 20:59:04 -05:00)
convert_diffusers_to_stable_diffusion.py: issue with conv (2023-01-12 20:59:04 -05:00)
convert_original_stable_diffusion_to_diffusers.py: Typo (2023-02-17 08:18:47 -06:00)
fs_helpers.py: Add support for enhanced dataset configuration (2023-03-08 15:02:14 +01:00)
get_yamls.py: bunch of updates, grad ckpting, no drop bucket, shuffle every epoch (2023-01-01 10:45:18 -05:00)
gpu.py: fix mem leak on huge data, rework optimizer to separate json, add lion optimizer (2023-02-25 15:05:22 -05:00)
huggingface_downloader.py: use StableDiffusionPipeline.from_pretrained() to download the model (2023-02-20 22:00:25 +01:00)
isolate_rng.py: GH-36: Add support for validation split (WIP) (2023-02-06 22:10:34 -08:00)
log_wrapper.py: merge (2023-01-15 22:07:37 -05:00)
patch_bnb.py: hey look ed2 (2022-12-17 22:32:48 -05:00)
sample_generator.py: if progress bars are disabled, log a short message instead (2023-03-03 10:50:48 +01:00)
split_dataset.py: some validation tweaks and fixes (2023-02-19 10:20:16 +01:00)