Commit Graph

164 Commits

Author SHA1 Message Date
Victor Hall a9c98f5866 bugged flag 2023-09-22 10:16:56 -04:00
Victor Hall 166c2e74e1 off by one on last epoch save 2023-09-21 21:29:36 -04:00
Victor Hall 09aa13c3dd Merge branch 'main' into feat_rolling_save 2023-09-20 16:32:37 -04:00
Victor Hall 2dff3aa8d1 ema update 2023-09-18 16:13:22 -04:00
Damian Stewart a68ebe3658 fix typo 2023-09-17 20:17:54 +02:00
Damian Stewart 3fddef3698 put back make_save_path and fix error in plugin runner 2023-09-17 20:16:26 +02:00
Victor Hall fa5b38e26b some minor updates to ema 2023-09-12 21:37:27 -04:00
alexds9 7259ce873b 1. Samples format changed so the global step appears before the "ema" indication. 2023-09-11 00:13:26 +03:00
alexds9 39b3082bf4 1. Making sure to release VRAM in samples. 2023-09-10 22:42:01 +03:00
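A minimal sketch of the VRAM release the commit above refers to, assuming a diffusers-style sample pipeline; the function and argument names are placeholders, not the trainer's actual code:

```python
import gc

import torch


def release_sample_vram(sample_pipeline) -> None:
    # Drop the local reference to whatever pipeline produced the samples
    # (the caller should drop its own references too), collect garbage,
    # then ask PyTorch to release its cached CUDA allocator blocks.
    del sample_pipeline
    gc.collect()
    if torch.cuda.is_available():
        torch.cuda.empty_cache()
```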
alexds9 d2d493c911 1. New parameters added to train.json and trainSD21.json - disabled by default.
2. Description added to ADVANCED_TWEAKING.md
2023-09-10 20:06:50 +03:00
alexds9 5b1760fff2 1. Added an argument ema_decay_resume_model to load an EMA model - it is loaded alongside the main model instead of copying the normal model. It's optional; without a loaded EMA model, the regular model is copied to become the first EMA model, just like before (see the sketch below).
2. Fixed the findlast option so regular models do not load EMA models by default.
3. findlast can also be used to load an EMA model when combined with ema_decay_resume_model.
4. Added an ema_device variable to store the device as a torch type.
5. Cleaned prints and comments.
2023-09-07 19:53:20 +03:00
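A minimal sketch of the initialization behavior item 1 describes, assuming the EMA resume path points at a loadable state dict (the real trainer may load a full model or pipeline instead); the function name and signature are illustrative:

```python
import copy
from typing import Optional

import torch


def init_ema_model(model: torch.nn.Module,
                   ema_decay_resume_model: Optional[str],
                   ema_device: torch.device) -> torch.nn.Module:
    # Start the EMA model either from a separately saved EMA state
    # (ema_decay_resume_model) or, if none is given, from a copy of the
    # regular model, as before.
    ema_model = copy.deepcopy(model)
    if ema_decay_resume_model:
        state = torch.load(ema_decay_resume_model, map_location="cpu")
        ema_model.load_state_dict(state)
    ema_model.requires_grad_(False)
    return ema_model.to(ema_device)
```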
alexds9 cf4a082e11 1. Fix to EMA samples arguments not respecting False value. 2023-09-06 23:04:12 +03:00
alexds9 5bcf9407f0 1. Improved EMA support: sample generation with EMA/non-EMA arguments, saving checkpoints and diffusers for both, and ema_decay_target implemented.
2. enable_zero_terminal_snr separated from zero_frequency_noise_ratio.
2023-09-06 22:37:10 +03:00
alexds9 23df727a1f Added support for:
1. EMA decay. The EMA model is updated every ema_decay_interval by (1 - ema_decay_rate) and can be stored on the CPU to save VRAM. Only the EMA model is saved now (see the sketch below).
2. min_snr_gamma - improves convergence speed; more info: https://arxiv.org/abs/2303.09556
3. load_settings_every_epoch - reloads 'train.json' at the start of every epoch.
2023-09-06 13:38:52 +03:00
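A plain-PyTorch sketch of the two training-time pieces named above: the EMA update rule (every ema_decay_interval steps, move the EMA weights toward the live weights by 1 - ema_decay_rate) and Min-SNR loss weighting for epsilon prediction. The default values and function names are illustrative, not the trainer's exact implementation:

```python
import torch


@torch.no_grad()
def ema_step(ema_model, model, global_step, ema_decay_rate=0.9999, ema_decay_interval=50):
    # Every ema_decay_interval steps: ema = decay * ema + (1 - decay) * live.
    # The EMA model may live on a different device (e.g. CPU) to save VRAM.
    if global_step % ema_decay_interval != 0:
        return
    for ema_p, live_p in zip(ema_model.parameters(), model.parameters()):
        ema_p.mul_(ema_decay_rate).add_(live_p.detach().to(ema_p.device),
                                        alpha=1.0 - ema_decay_rate)


def min_snr_weights(snr: torch.Tensor, min_snr_gamma: float = 5.0) -> torch.Tensor:
    # Min-SNR weighting (https://arxiv.org/abs/2303.09556) for epsilon prediction:
    # clamp the per-timestep SNR at gamma, then divide by the SNR.
    return torch.clamp(snr, max=min_snr_gamma) / snr
```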
Damian Stewart 9b5b96a50b fixes for ZTSNR training 2023-08-15 20:49:28 +02:00
Victor Hall 8007869b84 log exception if something blows up so it ends up in the .log 2023-07-05 16:02:18 -04:00
Victor Hall 42c417171d improve plugins 2023-07-04 17:29:22 -04:00
Victor Hall 1afaf59ec9 fix default for plugins 2023-06-29 22:00:16 -04:00
Victor Hall a72d455fc5 missed issue with plugin 2023-06-29 20:44:14 -04:00
Victor Hall aa7e004869 first crack at plugins 2023-06-27 20:53:48 -04:00
Damian Stewart 227f56427b write correct epoch number for the final save, and add flags to disable grad scaler tweaks, last-epoch renaming, and ckpt save 2023-06-17 19:18:04 +02:00
Victor Hall 6f64efaaaa Merge pull request #193 from damian0815/feat_user_defined_batching: User defined batching 2023-06-10 13:08:19 -04:00
damian 59fc9891d4 shuffle named batches while respecting and accounting for grad_accum 2023-06-07 18:07:37 +02:00
Damian Stewart 53d0686086 add a batch_id.txt file to subfolders or a batch_id key to local yaml to force images for that folder to be processed in the same batch 2023-06-05 01:02:27 +02:00
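An illustrative data layout for the feature above; everything except batch_id.txt and the batch_id key is a made-up example:

```
training_data/
  character_turnarounds/
    batch_id.txt      <- its contents name the batch these images must share
    front.jpg
    side.jpg
  landscapes/
    local.yaml        <- alternatively, add a "batch_id: <name>" key to the folder's local yaml
    ...
```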
Pat Shanahan e7d199e712 Fix spelling error 2023-06-04 10:13:37 -05:00
Victor Hall 1155a28867 zero terminal fixes 2023-06-03 21:41:56 -04:00
Victor Hall 81b7b00df7 use trained betas for zero terminal snr 2023-06-03 17:17:04 -04:00
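The usual way "trained betas" for zero terminal SNR are produced is the rescaling from Lin et al., "Common Diffusion Noise Schedules and Sample Steps are Flawed"; a generic sketch of that procedure, not necessarily this repo's exact code:

```python
import torch


def rescale_betas_zero_terminal_snr(betas: torch.Tensor) -> torch.Tensor:
    # Rescale a beta schedule so the final timestep has exactly zero SNR
    # (alpha_bar_T == 0); the result can be passed to the noise scheduler
    # as its "trained betas".
    alphas = 1.0 - betas
    alphas_bar_sqrt = torch.cumprod(alphas, dim=0).sqrt()

    # Shift so the last value becomes zero, then rescale so the first value is preserved.
    first, last = alphas_bar_sqrt[0], alphas_bar_sqrt[-1]
    alphas_bar_sqrt = (alphas_bar_sqrt - last) * first / (first - last)

    # Convert back to betas.
    alphas_bar = alphas_bar_sqrt ** 2
    alphas = torch.cat([alphas_bar[0:1], alphas_bar[1:] / alphas_bar[:-1]])
    return 1.0 - alphas
```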
Victor Hall a96c6e2166 dadapt stuff 2023-06-03 11:26:53 -04:00
Victor Hall 5c98cdee70 update to torch2, xformers 20, bnb 381 2023-05-30 22:15:02 -04:00
Victor Hall e6a176ea28 new bnb and beta zero terminal snr 2023-05-26 21:54:02 -04:00
Damian Stewart a6610625eb Squashed commit of the following:
commit 0f890f2d6bbccee225f738934f4c4450323f19a2
Merge: c008c40 003b089
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 14 11:47:40 2023 +0200

    Merge remote-tracking branch 'upstream/main' into feat_te_last_n_layers_unsquashed

commit c008c404f19ebc6b78085f42a4e39aeb2ba00d04
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 14 11:23:20 2023 +0200

    finalize TE layer freezing

commit 7377b10d59e32a6fea5d321a598ae4504e1a9f36
Author: Damian Stewart <d@damianstewart.com>
Date:   Thu May 11 20:45:28 2023 +0200

    remove zero_lr method

commit 4af13ba816c2811d7b5bd6fbb81a32bca6747e99
Author: Damian Stewart <d@damianstewart.com>
Date:   Thu May 11 20:05:01 2023 +0200

    Revert "rename parameters"

    This reverts commit aa33c61337599ab2d90b34aaf8c3d36fd4edf147.

commit aa33c61337599ab2d90b34aaf8c3d36fd4edf147
Author: Damian Stewart <d@damianstewart.com>
Date:   Tue May 9 00:28:00 2023 +0200

    rename parameters

commit 1da867e6fadb873da2571371a73b522406d76a18
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 22:28:29 2023 +0200

    remove silly check

commit 483cb2a635c3fe5a044edf4ea8de095bedc3f0ac
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 20:53:43 2023 +0200

    use 1e-10 not 0 as 'zero' lr

commit e5d230e6c765a7e25dc6381d09bd0a66a9a54ec2
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 20:51:51 2023 +0200

    add experimental 'zero_lr' freeze method

commit bcf24ee59a443c0ee71d622e65e1043b547f845e
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 17:32:11 2023 +0200

    fix layer selection bug

commit 7ee33eff8740e095f85042dcbb792e025b179c6c
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 17:25:25 2023 +0200

    put back the 'drop' method and make accessible

commit 76dfbf6dd6f43f3aa9a7f4629baa8e86573d9520
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 16:39:05 2023 +0200

    wip getting final_layer_norm to work

commit a19d43651a87525251106ed57238cd2cd1c3f3ff
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 16:15:53 2023 +0200

    work around a crash when freeze_final_layer_norm is True

commit c2a44eb25132941b92e2ecd0be3682ae3c6838c2
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 15:47:10 2023 +0200

    improve logging, add extra freezing controls

commit a31e64c4c0d12dfb6583dd6f22c8c09ba7840410
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 13:46:38 2023 +0200

    alternative method to freeze early TE layers

commit 095692fd4ea53707c012217898321860d8b9329f
Merge: 876072c 4c5ce81
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 11:52:51 2023 +0200

    Merge branch 'victorchall:main' into feat_te_last_n_layers

commit 876072c46394fde721a6026f7a6ef72ccb150ddb
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 01:41:50 2023 +0200

    implement last N layers training only for TE
2023-05-16 13:57:05 -04:00
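A sketch of what the squashed commit above implements - training only the last N text-encoder layers by freezing everything earlier - written against the standard transformers CLIPTextModel layout; the freeze_final_layer_norm flag comes from the commits, the other names are illustrative:

```python
from transformers import CLIPTextModel


def freeze_all_but_last_n_te_layers(text_encoder: CLIPTextModel,
                                    last_n: int,
                                    freeze_final_layer_norm: bool = False) -> None:
    # Freeze the token/position embeddings and all but the last `last_n`
    # transformer blocks; optionally freeze the final layer norm as well.
    tm = text_encoder.text_model
    tm.embeddings.requires_grad_(False)
    layers = tm.encoder.layers
    for layer in layers[: len(layers) - last_n]:
        layer.requires_grad_(False)
    if freeze_final_layer_norm:
        tm.final_layer_norm.requires_grad_(False)
```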
Damian Stewart f3468fe7e7 Merge remote-tracking branch 'upstream/main' into val_partial_epochs 2023-05-07 02:11:58 +02:00
Victor Hall cb511025ed finalizing optimizer split 2023-05-04 20:15:36 -04:00
Victor Hall 4e81a0eb55 optimizer split works 2023-05-04 20:15:36 -04:00
Victor Hall 970065c206 more wip optimizer splitting 2023-05-04 20:15:36 -04:00
Victor Hall 72a47741f0 optimizer splitting 2023-05-04 20:15:36 -04:00
Victor Hall 3639e36135 refactor optimizer to split te and unet 2023-05-04 20:15:36 -04:00
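A minimal sketch of the optimizer split the commits above build up: give the UNet and the text encoder their own optimizer and learning rate. The optimizer class and the default learning rates here are assumptions:

```python
import torch


def build_split_optimizers(unet: torch.nn.Module,
                           text_encoder: torch.nn.Module,
                           unet_lr: float = 1e-6,
                           text_encoder_lr: float = 3e-7):
    # Separate optimizers let the text encoder train at a different (usually
    # lower) learning rate than the UNet, or be frozen entirely.
    unet_optimizer = torch.optim.AdamW(unet.parameters(), lr=unet_lr)
    te_optimizer = torch.optim.AdamW(text_encoder.parameters(), lr=text_encoder_lr)
    return unet_optimizer, te_optimizer
```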
Victor Hall f0449c64e7 oops 2023-04-30 11:52:30 -04:00
Victor Hall 5d0b2ff24f remove notebook arg 2023-04-30 09:29:29 -04:00
Victor Hall 29a19fd8b1 check if your git commit is out of date 2023-04-29 18:15:25 -04:00
Damian Stewart aad00eab2e switch up errored item logic flow 2023-04-29 13:03:10 -04:00
Damian Stewart ce85ce30ae warn on chronically underfilled aspect ratio buckets 2023-04-29 13:03:10 -04:00
Victor Hall 743c7cccae print args after cleaning, set attn slicing for sd15 if not using amp 2023-04-16 18:48:44 -04:00
Victor Hall e3e30a5599 fix exif issues 2023-04-16 14:53:34 -04:00
Victor Hall d25b9661af fix bug in optimizer path 2023-04-15 13:27:12 -04:00
Victor Hall e574805326 Merge pull request #140 from damian0815/fix_samples_clip_skip: Respect clip_skip when generating samples 2023-04-14 21:30:23 -04:00
Damian Stewart 9b663cd23e use compel for sample prompting; enable clip skip for samples 2023-04-14 19:12:06 +02:00
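A sketch of the clip-skip part of sample prompting (the commit itself routes prompts through compel): take hidden states from an earlier text-encoder layer instead of the last one, then apply the final layer norm. Conventions vary between tools; here clip_skip=1 means the usual last layer, and the function name is illustrative:

```python
import torch
from transformers import CLIPTextModel, CLIPTokenizer


@torch.no_grad()
def encode_with_clip_skip(text_encoder: CLIPTextModel, tokenizer: CLIPTokenizer,
                          prompt: str, clip_skip: int = 1) -> torch.Tensor:
    # Tokenize the prompt and return per-token embeddings taken `clip_skip`
    # layers from the end of the text encoder, re-normalized by the final
    # layer norm as the downstream UNet expects.
    tokens = tokenizer(prompt, padding="max_length",
                       max_length=tokenizer.model_max_length,
                       truncation=True, return_tensors="pt")
    out = text_encoder(tokens.input_ids, output_hidden_states=True)
    hidden = out.hidden_states[-clip_skip]
    return text_encoder.text_model.final_layer_norm(hidden)
```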
tyler ecfeec80a7 support for saving the optimizer state 2023-04-13 22:31:22 -05:00
Victor Hall 35d52b56e0 added some resolutions, option for val-loss pos-neg, fix wandb 2023-03-25 20:09:06 -04:00
Victor Hall f01a7354f0 remove sort from dataset due to slowdown on large sets, add contribution readme 2023-03-18 22:24:03 -04:00