Commit Graph

188 Commits

Author SHA1 Message Date
Victor Hall 81b7b00df7 use trained betas for zero terminal snr 2023-06-03 17:17:04 -04:00
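This change feeds a rescaled ("trained") beta schedule to the noise scheduler so the final timestep has zero terminal SNR. A minimal sketch of the rescaling, following Algorithm 1 of Lin et al., "Common Diffusion Noise Schedules and Sample Steps are Flawed"; passing the result via the `trained_betas` argument of a diffusers scheduler is an assumption about how the commit wires it in:

```python
import torch

def rescale_zero_terminal_snr(betas: torch.Tensor) -> torch.Tensor:
    """Rescale a beta schedule so the last timestep has exactly zero SNR."""
    alphas_bar_sqrt = torch.cumprod(1.0 - betas, dim=0).sqrt()

    first = alphas_bar_sqrt[0].clone()
    last = alphas_bar_sqrt[-1].clone()

    # Shift so sqrt(alpha_bar_T) = 0, then rescale so sqrt(alpha_bar_1) is unchanged.
    alphas_bar_sqrt = (alphas_bar_sqrt - last) * first / (first - last)

    # Convert sqrt(alpha_bar) back to a beta schedule.
    alphas_bar = alphas_bar_sqrt ** 2
    alphas = torch.cat([alphas_bar[0:1], alphas_bar[1:] / alphas_bar[:-1]])
    return 1.0 - alphas
```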
Victor Hall a96c6e2166 dadapt stuff 2023-06-03 11:26:53 -04:00
Victor Hall 5c98cdee70 update to torch2, xformers 20, bnb 381 2023-05-30 22:15:02 -04:00
Victor Hall e6a176ea28 new bnb and beta zero terminal snr 2023-05-26 21:54:02 -04:00
Damian Stewart a6610625eb Squashed commit of the following:
commit 0f890f2d6bbccee225f738934f4c4450323f19a2
Merge: c008c40 003b089
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 14 11:47:40 2023 +0200

    Merge remote-tracking branch 'upstream/main' into feat_te_last_n_layers_unsquashed

commit c008c404f19ebc6b78085f42a4e39aeb2ba00d04
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 14 11:23:20 2023 +0200

    finalize TE layer freezing

commit 7377b10d59e32a6fea5d321a598ae4504e1a9f36
Author: Damian Stewart <d@damianstewart.com>
Date:   Thu May 11 20:45:28 2023 +0200

    remove zero_lr method

commit 4af13ba816c2811d7b5bd6fbb81a32bca6747e99
Author: Damian Stewart <d@damianstewart.com>
Date:   Thu May 11 20:05:01 2023 +0200

    Revert "rename parameters"

    This reverts commit aa33c61337599ab2d90b34aaf8c3d36fd4edf147.

commit aa33c61337599ab2d90b34aaf8c3d36fd4edf147
Author: Damian Stewart <d@damianstewart.com>
Date:   Tue May 9 00:28:00 2023 +0200

    rename parameters

commit 1da867e6fadb873da2571371a73b522406d76a18
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 22:28:29 2023 +0200

    remove silly check

commit 483cb2a635c3fe5a044edf4ea8de095bedc3f0ac
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 20:53:43 2023 +0200

    use 1e-10 not 0 as 'zero' lr

commit e5d230e6c765a7e25dc6381d09bd0a66a9a54ec2
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 20:51:51 2023 +0200

    add experimental 'zero_lr' freeze method

commit bcf24ee59a443c0ee71d622e65e1043b547f845e
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 17:32:11 2023 +0200

    fix layer selection bug

commit 7ee33eff8740e095f85042dcbb792e025b179c6c
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 17:25:25 2023 +0200

    put back the 'drop' method and make accessible

commit 76dfbf6dd6f43f3aa9a7f4629baa8e86573d9520
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 16:39:05 2023 +0200

    wip getting final_layer_norm to work

commit a19d43651a87525251106ed57238cd2cd1c3f3ff
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 16:15:53 2023 +0200

    work around a crash when freeze_final_layer_norm is True

commit c2a44eb25132941b92e2ecd0be3682ae3c6838c2
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 15:47:10 2023 +0200

    improve logging, add extra freezing controls

commit a31e64c4c0d12dfb6583dd6f22c8c09ba7840410
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 13:46:38 2023 +0200

    alternative method to freeze early TE layers

commit 095692fd4ea53707c012217898321860d8b9329f
Merge: 876072c 4c5ce81
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 11:52:51 2023 +0200

    Merge branch 'victorchall:main' into feat_te_last_n_layers

commit 876072c46394fde721a6026f7a6ef72ccb150ddb
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 01:41:50 2023 +0200

    implement last N layers training only for TE
2023-05-16 13:57:05 -04:00
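The squashed series above lands training of only the last N text-encoder (TE) layers, after experimenting with (and removing) a near-zero learning rate as a freeze mechanism. A minimal sketch of the freezing idea on a Hugging Face `CLIPTextModel`; the argument names here are illustrative, not ED2's:

```python
from transformers import CLIPTextModel

def freeze_early_te_layers(text_encoder: CLIPTextModel, train_last_n: int,
                           freeze_embeddings: bool = True,
                           freeze_final_layer_norm: bool = False) -> None:
    """Freeze everything except the last `train_last_n` transformer layers."""
    layers = text_encoder.text_model.encoder.layers
    num_frozen = max(0, len(layers) - train_last_n)
    for layer in layers[:num_frozen]:
        for p in layer.parameters():
            p.requires_grad = False
    if freeze_embeddings:
        for p in text_encoder.text_model.embeddings.parameters():
            p.requires_grad = False
    if freeze_final_layer_norm:
        for p in text_encoder.text_model.final_layer_norm.parameters():
            p.requires_grad = False
```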
Damian Stewart f3468fe7e7 Merge remote-tracking branch 'upstream/main' into val_partial_epochs 2023-05-07 02:11:58 +02:00
Victor Hall cb511025ed finalizing optimizer split 2023-05-04 20:15:36 -04:00
Victor Hall 4e81a0eb55 optimizer split works 2023-05-04 20:15:36 -04:00
Victor Hall 970065c206 more wip optimizer splitting 2023-05-04 20:15:36 -04:00
Victor Hall 72a47741f0 optimizer splitting 2023-05-04 20:15:36 -04:00
Victor Hall 3639e36135 refactor optimizer to split te and unet 2023-05-04 20:15:36 -04:00
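The five commits above split the optimizer so the text encoder and U-Net train at independent learning rates (finalized in cb511025). A sketch of the idea using parameter groups; whether ED2 uses groups or two separate optimizer instances, and the LR defaults shown, are assumptions:

```python
import torch

def build_split_optimizer(unet: torch.nn.Module, text_encoder: torch.nn.Module,
                          unet_lr: float = 1e-6, te_lr: float = 5e-7) -> torch.optim.Optimizer:
    """One optimizer, two parameter groups, so the TE and U-Net LRs can differ."""
    return torch.optim.AdamW([
        {"params": [p for p in unet.parameters() if p.requires_grad], "lr": unet_lr},
        {"params": [p for p in text_encoder.parameters() if p.requires_grad], "lr": te_lr},
    ])
```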
Victor Hall f0449c64e7 oops 2023-04-30 11:52:30 -04:00
Victor Hall 5d0b2ff24f remove notebook arg 2023-04-30 09:29:29 -04:00
Victor Hall 29a19fd8b1 check if your git commit is out of date 2023-04-29 18:15:25 -04:00
Damian Stewart aad00eab2e switch up errored item logic flow 2023-04-29 13:03:10 -04:00
Damian Stewart ce85ce30ae warn on chronically underfilled aspect ratio buckets 2023-04-29 13:03:10 -04:00
Victor Hall 743c7cccae print args after cleaning, set attn slicing for sd15 if not using amp 2023-04-16 18:48:44 -04:00
Victor Hall e3e30a5599 fix exif issues 2023-04-16 14:53:34 -04:00
Victor Hall d25b9661af fix bug in optimizer path 2023-04-15 13:27:12 -04:00
Victor Hall e574805326
Merge pull request #140 from damian0815/fix_samples_clip_skip
Respect clip_skip when generating samples
2023-04-14 21:30:23 -04:00
Damian Stewart 9b663cd23e use compel for sample prompting; enable clip skip for samples 2023-04-14 19:12:06 +02:00
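Here sample prompts are routed through damian0815's `compel` library for weighted prompt parsing, and sample generation starts honoring `clip_skip`. Clip skip means taking an earlier text-encoder hidden state instead of the final one; a sketch of that part (without compel), assuming a Hugging Face CLIP text encoder and the convention that `clip_skip=1` means "use the penultimate layer" (ED2's numbering may differ):

```python
import torch
from transformers import CLIPTextModel, CLIPTokenizer

def encode_with_clip_skip(tokenizer: CLIPTokenizer, text_encoder: CLIPTextModel,
                          prompt: str, clip_skip: int = 0) -> torch.Tensor:
    """Encode a prompt, optionally stopping `clip_skip` layers early."""
    tokens = tokenizer(prompt, padding="max_length",
                       max_length=tokenizer.model_max_length,
                       truncation=True, return_tensors="pt")
    out = text_encoder(tokens.input_ids, output_hidden_states=True)
    if clip_skip == 0:
        return out.last_hidden_state
    # Take the hidden state clip_skip layers before the end, then apply the
    # final layer norm, since the U-Net expects normalized embeddings.
    hidden = out.hidden_states[-(clip_skip + 1)]
    return text_encoder.text_model.final_layer_norm(hidden)
```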
tyler ecfeec80a7 support for saving the optimizer state 2023-04-13 22:31:22 -05:00
Victor Hall 35d52b56e0 added some resolutions, option for val-loss pos-neg, fix wandb 2023-03-25 20:09:06 -04:00
Victor Hall f01a7354f0 remove sort from dataset due to slowdown on large sets, add contribution readme 2023-03-18 22:24:03 -04:00
Victor Hall 59f14656fe make sure undersized image is utf8 2023-03-15 22:14:45 -04:00
Victor Hall ba687de8b4 add pbar back to preloading, remove cruft from testing loss stuff 2023-03-15 12:06:29 -04:00
Victor Hall d1bc94fe3e cruft left from experiment 2023-03-15 11:23:47 -04:00
Victor Hall 9389a90c67 defaulting amp to on now 2023-03-10 21:35:47 -05:00
Damian Stewart c913824979 fix every_n_epoch>1 logic and remove unnecessary log 2023-03-10 23:13:53 +01:00
Damian Stewart 103ab20696 allow sub-epoch validation when every_n_epochs <1 2023-03-09 20:42:17 +01:00
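With `every_n_epochs` below 1, validation fires multiple times within a single epoch. One way to express that as a step-count check (a sketch, not ED2's exact logic):

```python
def should_validate(global_step: int, steps_per_epoch: int, every_n_epochs: float) -> bool:
    """Validate every `every_n_epochs` epochs; values < 1 give sub-epoch
    validation, e.g. 0.25 validates four times per epoch."""
    interval = max(1, round(steps_per_epoch * every_n_epochs))
    return (global_step + 1) % interval == 0
```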
Damian Stewart 644c9e6b2a log to logging.info instead of stdout 2023-03-03 09:52:44 +01:00
Damian Stewart 15187ae2e2 fix log leaking color 2023-03-02 22:53:59 +01:00
Damian Stewart e2fd45737d overwrite args.seed with the actual seed if -1 is passed (so it appears in tensorboard)
also improve logging when unet training is disabled
2023-03-02 22:52:33 +01:00
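The fix replaces the -1 sentinel in `args.seed` with the concrete seed that was drawn, so the real value shows up in TensorBoard and the run is reproducible. Roughly (the seed range below is an illustrative choice):

```python
import argparse
import random

def resolve_seed(args: argparse.Namespace) -> int:
    """Replace a sentinel seed of -1 with a concrete random seed, writing it
    back into args so the logged config shows the seed actually used."""
    if args.seed == -1:
        args.seed = random.randint(0, 2**31 - 1)
    return args.seed
```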
Damian Stewart 97a8a49773 don't log separate text encoder LR if it's the same as unet LR 2023-03-02 22:36:32 +01:00
damian 61558be2ae logging and progress bar improvements 2023-03-02 18:29:28 +01:00
Damian Stewart 8100e42159 fix issues and improve sample generator 2023-03-02 13:03:50 +01:00
Damian Stewart c82664b3f3 add text encoder LR setting to optimizer.json 2023-03-02 00:13:43 +01:00
Victor Hall ba87b0cae1 log how lr was derived 2023-03-01 12:43:32 -05:00
Victor Hall 600eaa404d make sure main lr arg overrides optimizer.json 2023-03-01 12:26:36 -05:00
Victor Hall 6c87af7711 doc, fix lr defaulting bug, notdreambooth.md, read multicaption from .txt by line 2023-02-26 19:11:42 -05:00
Victor Hall f4999281ac
Merge pull request #93 from victorchall/lion
log lr with rest of optimizer settings to console, put lr into optimi…
2023-02-25 16:24:27 -05:00
Victor Hall b7de987f4c log lr with rest of optimizer settings to console, put lr into optimizer.json default 2023-02-25 16:23:33 -05:00
Victor Hall e7fc71ffa1
Merge pull request #92 from victorchall/lion
fix mem leak on huge data, rework optimizer to separate json, add lio…
2023-02-25 16:20:10 -05:00
Victor Hall a9b0189947 fix mem leak on huge data, rework optimizer to separate json, add lion optimizer 2023-02-25 15:05:22 -05:00
Damian Stewart 4646af7434 use StableDiffusionPipeline.from_pretrained() to download the model 2023-02-20 22:00:25 +01:00
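Using `StableDiffusionPipeline.from_pretrained()` delegates downloading and caching to diffusers (via the Hugging Face cache) instead of fetching files by hand. For example, with the model id purely illustrative:

```python
from diffusers import StableDiffusionPipeline

# Downloads (or reuses the cached copy of) the full pipeline in one call.
pipe = StableDiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-2-1")
unet, text_encoder, vae = pipe.unet, pipe.text_encoder, pipe.vae
```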
Damian Stewart 60702ee040 some validation tweaks and fixes 2023-02-19 10:20:16 +01:00
Damian Stewart 9365d5a7b2 restore saving on error 2023-02-19 00:21:20 +01:00
Victor Hall 80c6883729 default to 2perc zero freq noise 2023-02-18 15:17:08 -05:00
Victor Hall c071230a49
Merge pull request #75 from damian0815/sample_generation_refactor_redo
Refactor sample generation and introduce sample_prompts.json
2023-02-18 14:42:55 -05:00
Damian Stewart 230cab9e27 isolate RNG in sample generation 2023-02-18 20:18:21 +01:00
Victor Hall 2f48460691 fix ckpt save if only 1 step per epoch, revert epsilon for amp 2023-02-18 11:23:02 -05:00
Damian Stewart 648fe20200 better default batch size handling 2023-02-18 15:55:54 +01:00
Damian Stewart e97f0816db Squashed commit of the following:
commit 86fa1363852850e87be11e5a277b71435f6a3451
Author: Damian Stewart <d@damianstewart.com>
Date:   Sat Feb 18 14:43:57 2023 +0100

    cleanup, add back random caption support

commit f9a10842b47b9a5d51d53de8d56cb7089a1eeeb2
Author: Damian Stewart <d@damianstewart.com>
Date:   Sat Feb 18 13:52:22 2023 +0100

    misc fixes and documentation

commit 46167806892258fef509f14e9d83ceab08725cd6
Author: Damian Stewart <d@damianstewart.com>
Date:   Sat Feb 18 12:11:18 2023 +0100

    works

commit 390bcdf4d8165315e2f84404c62b410c7b674c84
Author: Damian Stewart <d@damianstewart.com>
Date:   Sat Feb 18 10:12:14 2023 +0100

    SampleGenerator code in place (untested)

commit 022724fa7a435371081fd489ee7e5dbfc2df37ec
Author: Damian Stewart <d@damianstewart.com>
Date:   Sat Feb 18 10:17:05 2023 +0100

    cleanup and new approach (untested)

commit 4ac81f0924146a7ac3c46f4a4382e7dceaaac47c
Author: Damian Stewart <d@damianstewart.com>
Date:   Fri Jan 27 17:26:12 2023 +0100

    fix 'classmethod is not callable' error

commit c875933096464a867a5c3cfbf9592605f201f79e
Author: Damian Stewart <d@damianstewart.com>
Date:   Fri Jan 27 17:10:03 2023 +0100

    fix prompts log crash

commit 2771d52485191388dfa5b3b8892ed7327d874ed6
Author: Damian Stewart <d@damianstewart.com>
Date:   Fri Jan 27 14:38:39 2023 +0100

    fix circular import

commit 8452272b02fe64a2345fba067a55e51c52debd98
Author: Damian Stewart <d@damianstewart.com>
Date:   Fri Jan 27 14:33:26 2023 +0100

    refactor sample generation (untested)
2023-02-18 15:51:50 +01:00
Victor Hall 37cf437a5f zero frequency noise option to improve contrast 2023-02-15 18:53:08 -05:00
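Zero frequency noise (elsewhere called offset noise) adds a small per-channel DC component to the training noise so the model can learn overall brightness shifts, which improves contrast; the later 80c68837 commit above defaults the scale to 2%. A sketch under assumed names:

```python
import torch

def sample_noise(latents: torch.Tensor, zero_freq_scale: float = 0.02) -> torch.Tensor:
    """Standard Gaussian noise plus a small per-channel constant offset."""
    noise = torch.randn_like(latents)
    if zero_freq_scale > 0:
        # One scalar per (batch, channel), broadcast over H and W: a DC term.
        offset = torch.randn(latents.shape[0], latents.shape[1], 1, 1,
                             device=latents.device, dtype=latents.dtype)
        noise = noise + zero_freq_scale * offset
    return noise
```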
Victor Hall e28b23851d save_ckpts_from_n_epochs arg to skip early saving of ckpts 2023-02-13 12:18:33 -05:00
Victor Hall 2353c4c16a validation doc update 2023-02-08 16:05:19 -05:00
Victor Hall 8b13c7ed1f val... 2023-02-08 13:34:49 -05:00
Victor Hall 27b66ab54f ability to not do val 2023-02-08 13:23:19 -05:00
Victor Hall 8a8a4cf3df make val optional, revert multiply algo 2023-02-08 13:04:12 -05:00
Damian Stewart 19347bcaa8 make fractional multiplier logic apply per-directory 2023-02-08 14:15:54 +01:00
Damian Stewart a7b00e9ef3 fix multiplier logic 2023-02-08 13:46:58 +01:00
Damian Stewart 4e37200dda fix multiplier issues with validation and refactor validation logic 2023-02-08 11:28:45 +01:00
Victor Hall 00ae25ae60
Merge branch 'main' into feat_cli_args_override_json_file 2023-02-07 20:31:01 -05:00
Victor Hall 6f4bfdc557 temporarily disable val, issues 2023-02-07 18:43:16 -05:00
damian 29396ec21b update EveryDreamValidator for noprompt's changes 2023-02-07 17:46:30 +01:00
Damian Stewart e0ca75cc96 remove redundant update_old_args() function 2023-02-07 13:46:19 +01:00
Damian Stewart 21a64c38f2 move collate_fn to top level to possibly fix windows issue 2023-02-06 19:03:33 +01:00
Damian Stewart c270dbf6a8 ensure dataloader workers exit cleanly on ctrl-c 2023-02-06 19:03:33 +01:00
Damian Stewart 4b5654452c more workers 2023-02-06 19:03:33 +01:00
Damian Stewart 86a5004098 better main-thread detection 2023-02-06 19:03:33 +01:00
Damian Stewart 50a71e63b6 background load images for a 40% performance improvement 2023-02-06 19:03:33 +01:00
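The 40% speedup comes from decoding images off the main thread while training continues. A minimal sketch of the idea with a thread pool; ED2's actual change is wired into its dataloader:

```python
from concurrent.futures import ThreadPoolExecutor
from typing import Iterator, List
from PIL import Image

def _decode(path: str) -> Image.Image:
    # Image.open is lazy; convert() forces the decode onto the worker thread.
    with Image.open(path) as img:
        return img.convert("RGB")

def background_load(paths: List[str], max_workers: int = 4) -> Iterator[Image.Image]:
    """Yield decoded images in order while worker threads decode ahead."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        yield from pool.map(_decode, paths)
```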
Damian Stewart 927880e1fc allow cli args to override config values 2023-02-06 08:16:08 +01:00
Victor Hall d99b3b1d9b
Merge pull request #38 from noprompt/push-dlma-into-main
Push DLMA into `main`, improvements to `data.resolve`
2023-02-05 08:20:10 -05:00
Joel Holdbrooks c8c658d181 Forgot to pass args to write_batch_schedule 2023-01-29 18:28:07 -08:00
Joel Holdbrooks f96d44ddb4 Move image resolution into its own function 2023-01-29 18:20:40 -08:00
Joel Holdbrooks 56f130c027 Forgot to add train.py earlier 🤦; move write_batch_schedule to train.py 2023-01-29 18:11:34 -08:00
Damian Stewart 3aa9139b4b re-read sample prompts every time they're generated 2023-01-30 00:51:03 +01:00
Victor Hall bc273d0512 change training example to constant lr 2023-01-28 18:20:04 -05:00
Victor Hall 94feea99ee fix colab notebook 2023-01-25 20:55:24 -05:00
Damian Stewart 448848d5c2
fix `'int' object is not callable` warning 2023-01-25 11:06:52 +01:00
Victor Hall 4433bd7806
Merge pull request #19 from janekm/wandb_fix
Fix wandb setup
2023-01-23 17:38:49 -08:00
Damian Stewart 067ea506a2 check for local files before downloading from HF 2023-01-23 21:10:04 +01:00
Damian Stewart 7a78ad0dfa prevent a crash if hf_repo_subfolder is unset 2023-01-23 19:38:47 +01:00
Damian Stewart 4f3b4d7dda clarify code comments 2023-01-23 19:20:37 +01:00
Damian Stewart d24dd681c0 Merge remote-tracking branch 'upstream/main' into hf_model_download 2023-01-23 19:19:22 +01:00
Victor Hall 9f5f773c33 remove saving yaml for sd1x models, unneeded 2023-01-22 21:43:03 -05:00
Victor Hall 18d1da0459 bug fix and multiplytxt fraction stuff 2023-01-22 18:59:59 -05:00
Victor Hall 24b00ab35b add fractional support for multiply.txt 2023-01-22 01:15:50 -05:00
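A `multiply.txt` value controls how often a folder's images repeat per epoch; fractional support lets the remainder act as a sampling probability. An illustrative sketch of that semantics (the real per-directory logic lives in ED2's data resolver, see a7b00e9e and 19347bca above):

```python
import random
from typing import List

def apply_multiplier(image_paths: List[str], multiplier: float) -> List[str]:
    """Repeat images by the integer part of multiply.txt, then randomly keep
    a fraction for the remainder: 2.5 yields each image twice plus a 50%
    chance of a third copy."""
    whole, frac = int(multiplier), multiplier - int(multiplier)
    out = image_paths * whole
    if frac > 0:
        out += [p for p in image_paths if random.random() < frac]
    return out
```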
Janek Mann ba5706c5d8
Fix wandb setup
Still need to have a log_writer
2023-01-21 23:26:04 +00:00
Victor Hall 36ece59660 better undersized log file 2023-01-20 16:23:56 -05:00
Victor Hall 1c2708dc63 few sanity checks and remove keyboard 2023-01-20 09:42:24 -05:00
Victor Hall 23faf05512 save yaml with ckpt files for easier loading 2023-01-18 13:07:05 -05:00
Victor Hall 3e803a8313 deprecate ed1_mode, autodetect 2023-01-17 12:44:18 -05:00
Victor Hall ed025d27b6 enable xformers for sd1 models if amp enabled 2023-01-16 19:11:41 -05:00
Victor Hall 879f5bf33d amp mode work 2023-01-16 15:48:06 -05:00
Victor Hall 6ea55a1057 default values for new rated dataset stuff if missing in json 2023-01-15 23:08:49 -05:00
Victor Hall ba25992140 merge 2023-01-15 22:07:37 -05:00
Jan Gerritsen 711e263e24 Implemented system to train on a subset of the dataset, favouring higher rated images 2023-01-14 16:18:22 +01:00
Damian Stewart 4e6d8f1157 Merge remote-tracking branch 'upstream/main' into hf_model_download 2023-01-13 20:38:51 +01:00
Victor Hall 4209174360 issue with conv 2023-01-12 20:59:04 -05:00