Commit Graph

590 Commits

Author SHA1 Message Date
Damian Stewart a8455b9427 wip first fit decreasing for batch/grad accum shuffling 2023-06-07 13:49:06 +02:00
Damian Stewart ba95b8c6d1 simplify runt handling 2023-06-05 01:04:21 +02:00
Damian Stewart 53d0686086 add a batch_id.txt file to subfolders or a batch_id key to local yaml to force images for that folder to be processed in the same batch 2023-06-05 01:02:27 +02:00
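The batch_id commit above describes a folder-level tag that forces a folder's images into the same training batch. A minimal sketch of how such a tag might be resolved; the helper name and the local yaml filename are assumptions for illustration, not the repo's exact code:

```python
import os
from typing import Optional

import yaml


def resolve_batch_id(folder: str) -> Optional[str]:
    """Return a folder-level batch id from batch_id.txt or a local yaml file, if present."""
    txt_path = os.path.join(folder, "batch_id.txt")
    if os.path.exists(txt_path):
        with open(txt_path, encoding="utf-8") as f:
            return f.read().strip()
    # the commit says "local yaml" but does not name the file; "local.yaml" is assumed here
    yaml_path = os.path.join(folder, "local.yaml")
    if os.path.exists(yaml_path):
        with open(yaml_path, encoding="utf-8") as f:
            cfg = yaml.safe_load(f) or {}
        return cfg.get("batch_id")
    return None
```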
Pat Shanahan e7d199e712 Fix spelling error 2023-06-04 10:13:37 -05:00
Victor Hall 7e09b6dc29 Merge pull request #188 from nawnie/main
Dep update
2023-06-04 00:15:15 -04:00
Victor Hall f2ddc81f63 Merge branch 'main' into main 2023-06-04 00:14:56 -04:00
Victor Hall 9970ccc3fb Update Train_Colab.ipynb
fix bad variable in colab notebook
2023-06-04 00:10:45 -04:00
Victor Hall 1155a28867 zero terminal fixes 2023-06-03 21:41:56 -04:00
Victor Hall 81b7b00df7 use trained betas for zero terminal snr 2023-06-03 17:17:04 -04:00
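The zero terminal SNR commits follow the beta-rescaling idea from "Common Diffusion Noise Schedules and Sample Steps are Flawed": rescale the noise schedule so the last timestep has zero SNR, then hand the result to the scheduler as trained betas. A minimal sketch of that rescaling, not taken from this repo's code:

```python
import torch


def rescale_zero_terminal_snr(betas: torch.Tensor) -> torch.Tensor:
    """Rescale a beta schedule so the final timestep has exactly zero SNR."""
    alphas = 1.0 - betas
    alphas_cumprod = torch.cumprod(alphas, dim=0)
    alphas_bar_sqrt = alphas_cumprod.sqrt()

    alphas_bar_sqrt_0 = alphas_bar_sqrt[0].clone()
    alphas_bar_sqrt_T = alphas_bar_sqrt[-1].clone()

    # shift so the last timestep hits zero, then rescale so the first
    # timestep keeps its original value
    alphas_bar_sqrt -= alphas_bar_sqrt_T
    alphas_bar_sqrt *= alphas_bar_sqrt_0 / (alphas_bar_sqrt_0 - alphas_bar_sqrt_T)

    # convert back to betas (e.g. to pass as trained_betas to a diffusers scheduler)
    alphas_bar = alphas_bar_sqrt ** 2
    alphas = alphas_bar[1:] / alphas_bar[:-1]
    alphas = torch.cat([alphas_bar[0:1], alphas])
    return 1.0 - alphas
```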
Victor Hall d7e12a78b0 add dadapt to win setup 2023-06-03 11:27:38 -04:00
Victor Hall 348f1d7f02 some more optimizer stuff 2023-06-03 11:27:17 -04:00
Victor Hall a96c6e2166 dadapt stuff 2023-06-03 11:26:53 -04:00
Shawn b8015bef2d Dep update 2023-06-03 02:20:40 -05:00
Victor Hall 9ee2effacd add 320 res 2023-06-01 21:19:20 -04:00
Victor Hall 97f1160496 correct image cropping 2023-06-01 21:06:55 -04:00
Victor Hall c0e7c4adf9 errant print 2023-06-01 19:23:43 -04:00
Victor Hall 615fa929e5 errant print 2023-06-01 19:17:43 -04:00
Victor Hall 0e0b546ef7 help debug crop error someone got 2023-06-01 19:11:22 -04:00
Victor Hall 56deb26a59 early work on shuffle_tags.txt and add try around trimming 2023-06-01 16:18:21 -04:00
MFAlex 7dcfa7acbf Speed up preloading by not loading pixel data
PIL lazy loads pixel data, so image size can be accessed without loading the full image.
This commit makes it so only image size and EXIF metadata are fetched from disk, speeding up the preload stage while still supporting transposing images.
2023-05-31 20:48:48 -04:00
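As the commit body notes, PIL opens images lazily, so dimensions and EXIF metadata can be read without decoding pixel data. A small illustration of that behavior, not the repo's actual preload code:

```python
from PIL import Image


def read_size_and_orientation(path: str):
    """Read image dimensions and EXIF orientation without decoding pixel data."""
    with Image.open(path) as img:  # lazy: header only, pixels are not decoded yet
        width, height = img.size
        orientation = img.getexif().get(0x0112)  # 274 = EXIF Orientation tag
    return (width, height), orientation
```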
Victor Hall 5c98cdee70 update to torch2, xformers 20, bnb 381 2023-05-30 22:15:02 -04:00
Kelvie Wong 249065f127 Add docker compose file
This makes it a lot easier to use locally on docker, and a lot easier to
move files in/out, as well as save notebooks.
2023-05-28 12:42:57 -07:00
Victor Hall 803cadfd53 missed checkin 2023-05-27 01:03:01 -04:00
Victor Hall f59b882376 Merge branch 'main' of https://github.com/victorchall/EveryDream2trainer 2023-05-26 21:54:12 -04:00
Victor Hall e6a176ea28 new bnb and beta zero terminal snr 2023-05-26 21:54:02 -04:00
Shawn 8fe41e460e xformers 20, and ui update 2023-05-24 02:24:38 -05:00
Damian Stewart 1939cd52b7 apply clip_grad_scale to the unscaled gradients just before stepping the scaler 2023-05-22 23:58:22 -04:00
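That gradient commit is about ordering under mixed precision: gradients must be unscaled before any clipping or scaling is applied, and only then should the scaler step the optimizer. A sketch of that ordering; the exact semantics of clip_grad_scale in the trainer are assumed here:

```python
import torch

scaler = torch.cuda.amp.GradScaler()


def training_step(loss, model, optimizer, clip_grad_scale: float = 1.0):
    scaler.scale(loss).backward()
    scaler.unscale_(optimizer)           # gradients are now in real (unscaled) units
    if clip_grad_scale != 1.0:
        for p in model.parameters():     # apply the scale to the unscaled grads...
            if p.grad is not None:
                p.grad.mul_(clip_grad_scale)
    scaler.step(optimizer)               # ...just before stepping the scaler
    scaler.update()
    optimizer.zero_grad(set_to_none=True)
```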
Victor Hall 562c434113 add banner jpg 2023-05-21 01:10:14 -04:00
Victor Hall 48e31ae58f remove high mem warning for colab 2023-05-20 21:47:00 -04:00
Shawn 927944a5ca Created using Colaboratory 2023-05-20 21:35:24 -04:00
Shawn 6a4e32f7cf Created using Colaboratory 2023-05-20 21:35:24 -04:00
Shawn d515908ed6 Created using Colaboratory 2023-05-20 21:35:24 -04:00
Victor Hall cdc0b39584 Update OPTIMIZER.md 2023-05-17 14:54:16 -04:00
Damian Stewart a0fca99972 docs typo 2023-05-16 13:57:05 -04:00
Damian Stewart 1b432d8819 docs for TE freezing 2023-05-16 13:57:05 -04:00
Damian Stewart 85ad289296 update SD2.1 default training settings 2023-05-16 13:57:05 -04:00
Damian Stewart a6610625eb Squashed commit of the following:
commit 0f890f2d6bbccee225f738934f4c4450323f19a2
Merge: c008c40 003b089
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 14 11:47:40 2023 +0200

    Merge remote-tracking branch 'upstream/main' into feat_te_last_n_layers_unsquashed

commit c008c404f19ebc6b78085f42a4e39aeb2ba00d04
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 14 11:23:20 2023 +0200

    finalize TE layer freezing

commit 7377b10d59e32a6fea5d321a598ae4504e1a9f36
Author: Damian Stewart <d@damianstewart.com>
Date:   Thu May 11 20:45:28 2023 +0200

    remove zero_lr method

commit 4af13ba816c2811d7b5bd6fbb81a32bca6747e99
Author: Damian Stewart <d@damianstewart.com>
Date:   Thu May 11 20:05:01 2023 +0200

    Revert "rename parameters"

    This reverts commit aa33c61337599ab2d90b34aaf8c3d36fd4edf147.

commit aa33c61337599ab2d90b34aaf8c3d36fd4edf147
Author: Damian Stewart <d@damianstewart.com>
Date:   Tue May 9 00:28:00 2023 +0200

    rename parameters

commit 1da867e6fadb873da2571371a73b522406d76a18
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 22:28:29 2023 +0200

    remove silly check

commit 483cb2a635c3fe5a044edf4ea8de095bedc3f0ac
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 20:53:43 2023 +0200

    use 1e-10 not 0 as 'zero' lr

commit e5d230e6c765a7e25dc6381d09bd0a66a9a54ec2
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 20:51:51 2023 +0200

    add experimental 'zero_lr' freeze method

commit bcf24ee59a443c0ee71d622e65e1043b547f845e
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 17:32:11 2023 +0200

    fix layer selection bug

commit 7ee33eff8740e095f85042dcbb792e025b179c6c
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 17:25:25 2023 +0200

    put back the 'drop' method and make accessible

commit 76dfbf6dd6f43f3aa9a7f4629baa8e86573d9520
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 16:39:05 2023 +0200

    wip getting final_layer_norm to work

commit a19d43651a87525251106ed57238cd2cd1c3f3ff
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 16:15:53 2023 +0200

    work around a crash when freeze_final_layer_norm is True

commit c2a44eb25132941b92e2ecd0be3682ae3c6838c2
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 15:47:10 2023 +0200

    improve logging, add extra freezing controls

commit a31e64c4c0d12dfb6583dd6f22c8c09ba7840410
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 13:46:38 2023 +0200

    alternative method to freeze early TE layers

commit 095692fd4ea53707c012217898321860d8b9329f
Merge: 876072c 4c5ce81
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 11:52:51 2023 +0200

    Merge branch 'victorchall:main' into feat_te_last_n_layers

commit 876072c46394fde721a6026f7a6ef72ccb150ddb
Author: Damian Stewart <d@damianstewart.com>
Date:   Sun May 7 01:41:50 2023 +0200

    implement last N layers training only for TE
2023-05-16 13:57:05 -04:00
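The squashed commit above implements training only the last N layers of the text encoder (TE), with optional freezing of the final layer norm. A rough sketch of that kind of freezing for a transformers CLIPTextModel; the function name and structure are illustrative, not the repo's exact implementation:

```python
from transformers import CLIPTextModel


def freeze_te_except_last_n(text_encoder: CLIPTextModel, n_trainable: int,
                            freeze_final_layer_norm: bool = False) -> None:
    """Freeze embeddings and all but the last n_trainable transformer layers."""
    layers = text_encoder.text_model.encoder.layers
    text_encoder.text_model.embeddings.requires_grad_(False)
    for layer in layers[: len(layers) - n_trainable]:
        layer.requires_grad_(False)
    if freeze_final_layer_norm:
        text_encoder.text_model.final_layer_norm.requires_grad_(False)
```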
Colai 4a2e0bebdd fix typo
example instead of exmaple
2023-05-16 13:54:42 -04:00
Damian Stewart e14973a9da fix restoring optimizer state typo 2023-05-14 12:27:24 -04:00
Victor Hall 003b08992b remove old print 2023-05-11 19:17:19 -04:00
Victor Hall 9f2edb499d Merge branch 'main' of https://github.com/victorchall/EveryDream2trainer 2023-05-08 18:25:43 -04:00
Victor Hall 6b8d66ee83 test reqs 2023-05-08 18:25:37 -04:00
Augusto de la Torre 98037d6006 Update LD_LIBRARY_PATH env on startup
Export env variables explicitly for Vast.ai
2023-05-08 15:16:19 -04:00
Victor Hall 6c562d5e78 Merge pull request #161 from damian0815/val_partial_epochs
Feature: enable partial epoch support for validation
2023-05-08 15:15:11 -04:00
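Partial-epoch validation means the validator can fire on a fraction of an epoch rather than only at epoch boundaries. A minimal sketch of such a trigger; the names are illustrative, and every_n_epochs is assumed to accept values below 1:

```python
def should_validate(global_step: int, steps_per_epoch: int, every_n_epochs: float) -> bool:
    """Return True when a validation pass is due, allowing every_n_epochs < 1."""
    every_n_steps = max(1, round(steps_per_epoch * every_n_epochs))
    return global_step > 0 and global_step % every_n_steps == 0
```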
Damian Stewart 8c05b7e1d5 update docs for every_n_epochs 2023-05-07 12:05:49 +02:00
Damian Stewart f3468fe7e7 Merge remote-tracking branch 'upstream/main' into val_partial_epochs 2023-05-07 02:11:58 +02:00
Victor Hall 4c5ce81b31 Update OPTIMIZER.md 2023-05-06 00:54:47 -04:00
Victor Hall cb511025ed finalizing optimizer split 2023-05-04 20:15:36 -04:00
Victor Hall 4e81a0eb55 optimizer split works 2023-05-04 20:15:36 -04:00
Victor Hall 970065c206 more wip optimizer splitting 2023-05-04 20:15:36 -04:00