Commit Graph

797 Commits

Author SHA1 Message Date
Damian Stewart 86aaf1c4d7 fix bug when loss_scale.txt contains 0 2023-11-01 19:00:08 +01:00
Damian Stewart c485d4ea60 fix device mismatch with loss_scale 2023-11-01 09:29:41 +01:00
Damian Stewart a7343ad190 fix scale batch calculation 2023-11-01 08:11:42 +01:00
Damian Stewart da731268b2 put a file loss_scale.txt containing a float in a training folder to apply loss scale (eg -1 for negative examples) 2023-10-31 10:06:21 +01:00
Damian Stewart bc1058a0d5 Merge branch 'victorchall:main' into feat_add_sde_samplers 2023-10-23 00:04:03 +02:00
Damian Stewart 380b2a54ca cleanup logging 2023-10-22 23:00:54 +02:00
Damian Stewart 69158fa51b fix distribution logic 2023-10-22 22:02:38 +02:00
Damian Stewart 20ce83d3e1 log curve 2023-10-22 21:41:20 +02:00
Damian Stewart 9a69ce84cb typo 2023-10-22 20:23:57 +02:00
Damian Stewart 6434844432 add missing json, fix error 2023-10-22 20:09:53 +02:00
Damian Stewart 9396d2156e Merge remote-tracking branch 'upstream/main' into feat_the_acculmunator 2023-10-22 19:35:32 +02:00
Damian Stewart 26a1475f0c initial implementation of the_acculmunator 2023-10-22 19:26:35 +02:00
Victor Hall 2f2dd4c1f2 Merge pull request #229 from luisgabrielroldan/fix-undersize-warning (Fix Inaccuracies in Image Size Validation and Warning Display) 2023-10-07 15:25:25 -04:00
Victor Hall 387616083c Merge pull request #228 from luisgabrielroldan/fix-save-ckpt-dir (Fix save_ckpt_dir not being honored when saving model) 2023-10-07 14:34:52 -04:00
Victor Hall 81807b6cab Merge pull request #227 from timlawrenz/patch-1 (Update CAPTION.md) 2023-10-07 14:32:16 -04:00
Gabriel Roldán 60e10867bc Fix undersize warning 2023-10-05 01:48:09 -03:00
Gabriel Roldán f301677881 Fix save_ckpt_dir not being honored when saving model 2023-10-01 00:09:24 -03:00
Tim Lawrenz 8f9d23cf3b Update CAPTION.md (minor typo) 2023-09-27 08:17:01 -04:00
Victor Hall e8e4f0c2ea Merge pull request #214 from luisgabrielroldan/keep_tags (Add --keep_tags to keep first N tags fixed on shuffle) 2023-09-25 13:10:21 -04:00
Victor Hall 85b5bcc6e4 Update Train_Colab.ipynb 2023-09-25 12:34:13 -04:00
Victor Hall 3a2485eac5 fix save_every_n_epoch bug in colab 2023-09-25 12:15:29 -04:00
Victor Hall 8064c29066 add safetensors to colab req 2023-09-25 12:01:35 -04:00
Victor Hall 9e820c8cce doc for lion8bit and prodigy 2023-09-22 16:02:22 -04:00
Victor Hall 78a4c13c4c add prodigyopt requirements 2023-09-22 13:46:25 -04:00
Victor Hall 5444abfd7b finalize lion8bit and add prodigy 2023-09-22 12:15:32 -04:00
Victor Hall a9c98f5866 bugged flag 2023-09-22 10:16:56 -04:00
Victor Hall 166c2e74e1 off by one on last epoch save 2023-09-21 21:29:36 -04:00
Victor Hall ada6037463 add bnb lion8bit support 2023-09-21 13:48:40 -04:00
Victor Hall a47d65799f early work on pinned image tensor 2023-09-21 13:48:40 -04:00
Gabriel Roldan f2d5a40f72 Add --keep_tags docs and fix some typos 2023-09-20 19:57:04 -03:00
Gabriel Roldan 99a0431d0f Ignore negative keep_tags values 2023-09-20 19:53:31 -03:00
Gabriel Roldán 43984f2ad3 Add --keep_tags to keep first N tags fixed on shuffle 2023-09-20 19:53:30 -03:00
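The --keep_tags behaviour described in these commits (shuffle caption tags while pinning the first N, ignoring negative values) could be sketched as follows. This is a simplified illustration of the feature as the commit messages describe it, not the project's actual implementation; the function name `shuffle_tags` is assumed.

```python
import random

def shuffle_tags(tags: list, keep_tags: int = 0) -> list:
    """Shuffle caption tags, keeping the first `keep_tags` fixed in place.

    Negative values are treated as 0, matching the
    "Ignore negative keep_tags values" commit. Sketch only.
    """
    keep = max(0, keep_tags)
    fixed, rest = tags[:keep], tags[keep:]
    random.shuffle(rest)  # only the tail is shuffled
    return fixed + rest
```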
Victor Hall 6c8d15daab Merge pull request #209 from damian0815/feat_rolling_save (Feature: Rolling save ckpt) 2023-09-20 16:32:48 -04:00
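The rolling-save idea referenced by the feat_rolling_save branch (keep only the newest few checkpoints, deleting older ones as training proceeds) could be sketched like this. A minimal illustration of the general technique, assuming paths sort oldest-first; the name `prune_checkpoints` and the return value are assumptions, not the repo's API.

```python
import os

def prune_checkpoints(ckpt_paths: list, keep_last: int) -> list:
    """Rolling save: delete all but the newest `keep_last` checkpoint files.

    `ckpt_paths` is assumed ordered oldest-first. Returns the pruned paths.
    Sketch only; real checkpoints may be directories (use shutil.rmtree).
    """
    removed = ckpt_paths[:-keep_last] if keep_last > 0 else list(ckpt_paths)
    for path in removed:
        if os.path.exists(path):
            os.remove(path)  # skip paths that were already cleaned up
    return removed
```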
Victor Hall 09aa13c3dd Merge branch 'main' into feat_rolling_save 2023-09-20 16:32:37 -04:00
Victor Hall 2dff3aa8d1 ema update 2023-09-18 16:13:22 -04:00
Victor Hall 303c8312e3 update ema sample args again 2023-09-18 16:12:51 -04:00
Victor Hall 29bab698a3 minor update to ema docs 2023-09-18 15:07:39 -04:00
Victor Hall 2f52832209 fix trainSD21.json and advanced tweaking ema param names 2023-09-18 14:54:43 -04:00
Victor Hall e4d93225f7 fix train21.json ema param names 2023-09-18 14:53:17 -04:00
Damian Stewart a68ebe3658 fix typo 2023-09-17 20:17:54 +02:00
Damian Stewart 3579bec540 cleanup 2023-09-17 20:17:33 +02:00
Damian Stewart 74c21602fc default every epoch not 3x per epoch 2023-09-17 20:16:29 +02:00
Damian Stewart 3fddef3698 put back make_save_path and fix error in plugin runner 2023-09-17 20:16:26 +02:00
Victor Hall fa5b38e26b some minor updates to ema 2023-09-12 21:37:27 -04:00
Victor Hall 49de395df1 Merge pull request #224 from a-l-e-x-d-s-9/main (EMA decay, min-SNR-gamma, settings loading every epoch, zero terminal SNR separation from ZFN) 2023-09-12 15:36:27 -04:00
alexds9 7259ce873b 1. Samples format change to make sure the global step appears before the "ema" indication. 2023-09-11 00:13:26 +03:00
alexds9 39b3082bf4 1. Making sure to release VRAM in samples. 2023-09-10 22:42:01 +03:00
alexds9 d2d493c911 1. New parameters added to train.json and trainSD21.json - disabled by default. 2. Description added to ADVANCED_TWEAKING.md 2023-09-10 20:06:50 +03:00
alexds9 5b1760fff2 1. Added an argument ema_decay_resume_model to load an EMA model - it is loaded alongside the main model instead of copying the normal model. It's optional; without a loaded EMA model, the regular model is copied to be the first EMA model, just like before.
2. Fixed findlast option for regular models not to load EMA models by default.
3. findlast can be used to load an EMA model too when used with ema_decay_resume_model.
4. Added ema_device variable to store the device as a torch type.
5. Cleaned prints and comments.
2023-09-07 19:53:20 +03:00
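The EMA-decay feature these commits build on follows the standard update rule ema = decay * ema + (1 - decay) * w, applied per parameter each step. A minimal dependency-free sketch of that rule (plain floats rather than tensors; the real trainer also manages ema_device placement and resuming via ema_decay_resume_model, omitted here):

```python
def ema_update(ema_weights: dict, weights: dict, decay: float) -> dict:
    """One EMA step per parameter: ema = decay * ema + (1 - decay) * w.

    Illustrative sketch of the exponential-moving-average rule; not the
    trainer's actual tensor implementation.
    """
    return {
        name: decay * ema_weights[name] + (1.0 - decay) * weights[name]
        for name in ema_weights
    }
```

With decay close to 1 (e.g. 0.999) the EMA weights change slowly, smoothing out step-to-step noise in the trained weights.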
alexds9 cf4a082e11 1. Fix to EMA samples arguments not respecting False value. 2023-09-06 23:04:12 +03:00