Commit Graph

144 Commits

Author SHA1 Message Date
Victor Hall 3cfecf8729 wip for bf16 testing 2023-06-15 13:54:46 -04:00
Victor Hall 6f64efaaaa Merge pull request #193 from damian0815/feat_user_defined_batching: User defined batching 2023-06-10 13:08:19 -04:00
damian 59fc9891d4 shuffle named batches while respecting and accounting for grad_accum 2023-06-07 18:07:37 +02:00
Damian Stewart 53d0686086 add a batch_id.txt file to subfolders or a batch_id key to local yaml to force images for that folder to be processed in the same batch 2023-06-05 01:02:27 +02:00
Pat Shanahan e7d199e712 Fix spelling error 2023-06-04 10:13:37 -05:00
Victor Hall 1155a28867 zero terminal fixes 2023-06-03 21:41:56 -04:00
Victor Hall 81b7b00df7 use trained betas for zero terminal snr 2023-06-03 17:17:04 -04:00
Victor Hall a96c6e2166 dadapt stuff 2023-06-03 11:26:53 -04:00
Victor Hall 5c98cdee70 update to torch2, xformers 20, bnb 381 2023-05-30 22:15:02 -04:00
Victor Hall e6a176ea28 new bnb and beta zero terminal snr 2023-05-26 21:54:02 -04:00
Damian Stewart a6610625eb Squashed commit of the following:
commit 0f890f2d6bbccee225f738934f4c4450323f19a2
Merge: c008c40 003b089
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 14 11:47:40 2023 +0200

    Merge remote-tracking branch 'upstream/main' into feat_te_last_n_layers_unsquashed

commit c008c404f19ebc6b78085f42a4e39aeb2ba00d04
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 14 11:23:20 2023 +0200

    finalize TE layer freezing

commit 7377b10d59e32a6fea5d321a598ae4504e1a9f36
Author: Damian Stewart <d@damianstewart.com>
Date:   Thu May 11 20:45:28 2023 +0200

    remove zero_lr method

commit 4af13ba816c2811d7b5bd6fbb81a32bca6747e99
Author: Damian Stewart <d@damianstewart.com>
Date:   Thu May 11 20:05:01 2023 +0200

    Revert "rename parameters"

    This reverts commit aa33c61337599ab2d90b34aaf8c3d36fd4edf147.

commit aa33c61337599ab2d90b34aaf8c3d36fd4edf147
Author: Damian Stewart <d@damianstewart.com>
Date:   Tue May 9 00:28:00 2023 +0200

    rename parameters

commit 1da867e6fadb873da2571371a73b522406d76a18
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 22:28:29 2023 +0200

    remove silly check

commit 483cb2a635c3fe5a044edf4ea8de095bedc3f0ac
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 20:53:43 2023 +0200

    use 1e-10 not 0 as 'zero' lr

commit e5d230e6c765a7e25dc6381d09bd0a66a9a54ec2
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 20:51:51 2023 +0200

    add experimental 'zero_lr' freeze method

commit bcf24ee59a443c0ee71d622e65e1043b547f845e
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 17:32:11 2023 +0200

    fix layer selection bug

commit 7ee33eff8740e095f85042dcbb792e025b179c6c
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 17:25:25 2023 +0200

    put back the 'drop' method and make accessible

commit 76dfbf6dd6f43f3aa9a7f4629baa8e86573d9520
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 16:39:05 2023 +0200

    wip getting final_layer_norm to work

commit a19d43651a87525251106ed57238cd2cd1c3f3ff
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 16:15:53 2023 +0200

    work around a crash when freeze_final_layer_norm is True

commit c2a44eb25132941b92e2ecd0be3682ae3c6838c2
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 15:47:10 2023 +0200

    improve logging, add extra freezing controls

commit a31e64c4c0d12dfb6583dd6f22c8c09ba7840410
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 13:46:38 2023 +0200

    alternative method to freeze early TE layers

commit 095692fd4ea53707c012217898321860d8b9329f
Merge: 876072c 4c5ce81
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 11:52:51 2023 +0200

    Merge branch 'victorchall:main' into feat_te_last_n_layers

commit 876072c46394fde721a6026f7a6ef72ccb150ddb
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 01:41:50 2023 +0200

    implement last N layers training only for TE
2023-05-16 13:57:05 -04:00
Damian Stewart f3468fe7e7 Merge remote-tracking branch 'upstream/main' into val_partial_epochs 2023-05-07 02:11:58 +02:00
Victor Hall cb511025ed finalizing optimizer split 2023-05-04 20:15:36 -04:00
Victor Hall 4e81a0eb55 optimizer split works 2023-05-04 20:15:36 -04:00
Victor Hall 970065c206 more wip optimizer splitting 2023-05-04 20:15:36 -04:00
Victor Hall 72a47741f0 optimizer spltting 2023-05-04 20:15:36 -04:00
Victor Hall 3639e36135 refactor optimizer to split te and unet 2023-05-04 20:15:36 -04:00
Victor Hall f0449c64e7 oops 2023-04-30 11:52:30 -04:00
Victor Hall 5d0b2ff24f remove notebook arg 2023-04-30 09:29:29 -04:00
Victor Hall 29a19fd8b1 check if your git commit is out of date 2023-04-29 18:15:25 -04:00
Damian Stewart aad00eab2e switch up errored item logic flow 2023-04-29 13:03:10 -04:00
Damian Stewart ce85ce30ae warn on chronically underfilled aspect ratio buckets 2023-04-29 13:03:10 -04:00
Victor Hall 743c7cccae print args after cleaning, set attn slicing for sd15 if not using amp 2023-04-16 18:48:44 -04:00
Victor Hall e3e30a5599 fix exif issues 2023-04-16 14:53:34 -04:00
Victor Hall d25b9661af fix bug in optimizer path 2023-04-15 13:27:12 -04:00
Victor Hall e574805326 Merge pull request #140 from damian0815/fix_samples_clip_skip: Respect clip_skip when generating samples 2023-04-14 21:30:23 -04:00
Damian Stewart 9b663cd23e use compel for sample prompting; enable clip skip for samples 2023-04-14 19:12:06 +02:00
tyler ecfeec80a7 support for saving the optimizer state 2023-04-13 22:31:22 -05:00
Victor Hall 35d52b56e0 added some resolutions, option for val-loss pos-neg, fix wandb 2023-03-25 20:09:06 -04:00
Victor Hall f01a7354f0 remove sort from dataset due to slowdown on large sets, add contribution readme 2023-03-18 22:24:03 -04:00
Victor Hall 59f14656fe make sure undersized image is utf8 2023-03-15 22:14:45 -04:00
Victor Hall ba687de8b4 add pbar back to preloading, remove cruft from testing loss stuff 2023-03-15 12:06:29 -04:00
Victor Hall d1bc94fe3e cruft left from experiment 2023-03-15 11:23:47 -04:00
Victor Hall 9389a90c67 defaulting amp to on now 2023-03-10 21:35:47 -05:00
Damian Stewart c913824979 fix every_n_epoch>1 logic and remove unnecessary log 2023-03-10 23:13:53 +01:00
Damian Stewart 103ab20696 allow sub-epoch validation when every_n_epochs <1 2023-03-09 20:42:17 +01:00
Damian Stewart 644c9e6b2a log to logging.info instead of stdout 2023-03-03 09:52:44 +01:00
Damian Stewart 15187ae2e2 fix log leaking color 2023-03-02 22:53:59 +01:00
Damian Stewart e2fd45737d overwrite args.seed with the actual seed if -1 is passed (so it appears in tensorboard); also improve logging when unet training is disabled 2023-03-02 22:52:33 +01:00
Damian Stewart 97a8a49773 don't log separate text encoder LR if it's the same as unet LR 2023-03-02 22:36:32 +01:00
damian 61558be2ae logging and progress bar improvements 2023-03-02 18:29:28 +01:00
Damian Stewart 8100e42159 fix issues and improve sample generator 2023-03-02 13:03:50 +01:00
Damian Stewart c82664b3f3 add text encoder LR setting to optimizer.json 2023-03-02 00:13:43 +01:00
Victor Hall ba87b0cae1 log how lr was derived 2023-03-01 12:43:32 -05:00
Victor Hall 600eaa404d make sure main lr arg overrides optimizer.json 2023-03-01 12:26:36 -05:00
Victor Hall 6c87af7711 doc, fix lr defaulting bug, notdreambooth.md, read multicaption from .txt by line 2023-02-26 19:11:42 -05:00
Victor Hall f4999281ac Merge pull request #93 from victorchall/lion: log lr with rest of optimizer settings to console, put lr into optimi… 2023-02-25 16:24:27 -05:00
Victor Hall b7de987f4c log lr with rest of optimizer settings to console, put lr into optimizer.json default 2023-02-25 16:23:33 -05:00
Victor Hall e7fc71ffa1 Merge pull request #92 from victorchall/lion: fix mem leak on huge data, rework optimizer to separate json, add lio… 2023-02-25 16:20:10 -05:00
Victor Hall a9b0189947 fix mem leak on huge data, rework optimizer to separate json, add lion optimizer 2023-02-25 15:05:22 -05:00