Commit Graph

34 Commits

Author SHA1 Message Date
alexds9 32ee7cb7f0 Using -1.0 instead of -1. 2023-11-17 21:55:12 +02:00
alexds9 fc6a4a7c27 LR removed from AdaCoor. 2023-11-17 20:52:15 +02:00
alexds9 5dc9f18061 1. Added AdaCoor optimizer. 2. Added pyramid noise. 3. Fixed problem with log_writer missing from EveryDreamOptimizer. 2023-11-17 16:08:43 +02:00
Victor Hall 840493037e gg bug from pr that thought it was simplifying code but broke it 2023-11-15 16:19:08 -05:00
Victor Hall 097d864ef5 fixing bugs 2023-11-15 15:57:09 -05:00
Victor Hall 20a9b3254f bug in optimizer decay-warmup defaulting 2023-11-15 15:46:28 -05:00
reijerh 7de666ec2d Misc minor fixes 2023-11-09 00:22:41 +01:00
Victor Hall 6ea721887c allow scheduler change for training 2023-11-05 21:14:54 -05:00
Victor Hall 4fb64fed66 updating reqs 2023-11-02 21:54:33 -04:00
Victor Hall 5444abfd7b finalize lion8bit and add prodigy 2023-09-22 12:15:32 -04:00
Victor Hall ada6037463 add bnb lion8bit support 2023-09-21 13:48:40 -04:00
Victor Hall aa7e004869 first crack at plugins 2023-06-27 20:53:48 -04:00
Victor Hall 3bf95d4edc add dadapt_adan and fix bug in decay steps for te 2023-06-24 14:41:16 -04:00
Victor Hall a916934bb8 Merge branch 'main' into fix_simplify_freezing_text_encoder_layers 2023-06-18 00:53:51 -04:00
Damian Stewart 227f56427b write correct epoch number of final save, and add flags to disable grad scaler tweaks, last epoch renaming, ckpt save 2023-06-17 19:18:04 +02:00
Damian Stewart eca442f7fb simplify freezing text encoder layers config 2023-06-17 18:54:06 +02:00
SargeZT 2045bcc7c8 i'm a dumbass pass it on 2023-06-09 21:56:05 -05:00
SargeZT f9640d7c51 Re-added coordinate dowg with 0.3.1's release and fix 2023-06-09 21:40:17 -05:00
SargeZT e5b08f92b4 Disabled CoordinateDoWG for now because it's a bit unstable with SD training, don't need the troubleshooting requests 2023-06-08 11:36:20 -05:00
SargeZT 4861d96ec2 Added CoordinateDoWG and ScalarDoWG 2023-06-08 10:39:46 -05:00
Victor Hall 1155a28867 zero terminal fixes 2023-06-03 21:41:56 -04:00
Victor Hall 348f1d7f02 some more optimizer stuff 2023-06-03 11:27:17 -04:00
Victor Hall a96c6e2166 dadapt stuff 2023-06-03 11:26:53 -04:00
Victor Hall f59b882376 Merge branch 'main' of https://github.com/victorchall/EveryDream2trainer 2023-05-26 21:54:12 -04:00
Victor Hall e6a176ea28 new bnb and beta zero terminal snr 2023-05-26 21:54:02 -04:00
Damian Stewart 1939cd52b7 apply clip_grad_scale to the unscaled gradients just before stepping the scaler 2023-05-22 23:58:22 -04:00
Damian Stewart a6610625eb Squashed commit of the following:
commit 0f890f2d6bbccee225f738934f4c4450323f19a2
Merge: c008c40 003b089
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 14 11:47:40 2023 +0200

    Merge remote-tracking branch 'upstream/main' into feat_te_last_n_layers_unsquashed

commit c008c404f19ebc6b78085f42a4e39aeb2ba00d04
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 14 11:23:20 2023 +0200

    finalize TE layer freezing

commit 7377b10d59e32a6fea5d321a598ae4504e1a9f36
Author: Damian Stewart <d@damianstewart.com>
Date:   Thu May 11 20:45:28 2023 +0200

    remove zero_lr method

commit 4af13ba816c2811d7b5bd6fbb81a32bca6747e99
Author: Damian Stewart <d@damianstewart.com>
Date:   Thu May 11 20:05:01 2023 +0200

    Revert "rename parameters"

    This reverts commit aa33c61337599ab2d90b34aaf8c3d36fd4edf147.

commit aa33c61337599ab2d90b34aaf8c3d36fd4edf147
Author: Damian Stewart <d@damianstewart.com>
Date:   Tue May 9 00:28:00 2023 +0200

    rename parameters

commit 1da867e6fadb873da2571371a73b522406d76a18
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 22:28:29 2023 +0200

    remove silly check

commit 483cb2a635c3fe5a044edf4ea8de095bedc3f0ac
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 20:53:43 2023 +0200

    use 1e-10 not 0 as 'zero' lr

commit e5d230e6c765a7e25dc6381d09bd0a66a9a54ec2
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 20:51:51 2023 +0200

    add experimental 'zero_lr' freeze method

commit bcf24ee59a443c0ee71d622e65e1043b547f845e
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 17:32:11 2023 +0200

    fix layer selection bug

commit 7ee33eff8740e095f85042dcbb792e025b179c6c
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 17:25:25 2023 +0200

    put back the 'drop' method and make accessible

commit 76dfbf6dd6f43f3aa9a7f4629baa8e86573d9520
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 16:39:05 2023 +0200

    wip getting final_layer_norm to work

commit a19d43651a87525251106ed57238cd2cd1c3f3ff
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 16:15:53 2023 +0200

    work around a crash when freeze_final_layer_norm is True

commit c2a44eb25132941b92e2ecd0be3682ae3c6838c2
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 15:47:10 2023 +0200

    improve logging, add extra freezing controls

commit a31e64c4c0d12dfb6583dd6f22c8c09ba7840410
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 13:46:38 2023 +0200

    alternative method to freeze early TE layers

commit 095692fd4ea53707c012217898321860d8b9329f
Merge: 876072c 4c5ce81
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 11:52:51 2023 +0200

    Merge branch 'victorchall:main' into feat_te_last_n_layers

commit 876072c46394fde721a6026f7a6ef72ccb150ddb
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 01:41:50 2023 +0200

    implement last N layers training only for TE
2023-05-16 13:57:05 -04:00
Damian Stewart e14973a9da fix restoring optimizer state typo 2023-05-14 12:27:24 -04:00
Victor Hall 003b08992b remove old print 2023-05-11 19:17:19 -04:00
Victor Hall cb511025ed finalizing optimizer split 2023-05-04 20:15:36 -04:00
Victor Hall 4e81a0eb55 optimizer split works 2023-05-04 20:15:36 -04:00
Victor Hall 970065c206 more wip optimizer splitting 2023-05-04 20:15:36 -04:00
Victor Hall 72a47741f0 optimizer spltting 2023-05-04 20:15:36 -04:00
Victor Hall 3639e36135 refactor optimizer to split te and unet 2023-05-04 20:15:36 -04:00