Commit Graph

  • 6ea721887c allow scheduler change for training Victor Hall 2023-11-05 21:14:54 -0500
  • 21361a3622 option to swap training scheduler Victor Hall 2023-11-05 20:54:09 -0500
  • 4fae89fdee docker work Victor Hall 2023-11-05 20:53:21 -0500
  • c0a1955164 Merge pull request #233 from damian0815/feat_the_acculmunator Victor Hall 2023-11-05 19:53:11 -0500
  • 3134fa1964 Merge pull request #236 from erichocean/erichocean-patch-1 Victor Hall 2023-11-05 19:52:06 -0500
  • 9692031c27 work on docker Victor Hall 2023-11-05 19:33:44 -0500
  • d999c2fc1c Update CLOUD_SETUP.md #236 Erich Ocean 2023-11-05 18:49:12 -0500
  • 45f41289db fix xf ver in dockerfile Victor Hall 2023-11-04 09:29:05 -0400
  • 2bda841b2f Merge pull request #232 from damian0815/feat_add_sde_samplers Victor Hall 2023-11-03 17:37:25 -0400
  • 848b6eec71 fix balancing doc Victor Hall 2023-11-03 13:51:35 -0400
  • 2512981956 remove errant print from kosmos script Victor Hall 2023-11-03 13:50:47 -0400
  • 4c3de29d43 kosmos update and docs for loss_scale.txt Victor Hall 2023-11-03 13:50:11 -0400
  • 30b063dfec Merge pull request #235 from damian0815/feat_negative_loss Victor Hall 2023-11-03 13:36:36 -0400
  • 164f635c6f dtype added for kosmos-2 captions Victor Hall 2023-11-02 23:24:00 -0400
  • 24a692801a update kosmos2 caption to use cuda by default Victor Hall 2023-11-02 22:53:10 -0400
  • 4fb64fed66 updating reqs Victor Hall 2023-11-02 21:54:29 -0400
  • 3150f7d299 add kosmos-2 caption script and update diffusers and transformers to latest Victor Hall 2023-11-02 21:47:50 -0400
  • 86aaf1c4d7 fix bug when loss_scale.txt contains 0 #235 Damian Stewart 2023-11-01 19:00:08 +0100
  • c485d4ea60 fix device mismatch with loss_scale Damian Stewart 2023-11-01 09:29:41 +0100
  • a7343ad190 fix scale batch calculation Damian Stewart 2023-11-01 08:11:42 +0100
  • da731268b2 put a file loss_scale.txt containing a float in a training folder to apply loss scale (eg -1 for negative examples) Damian Stewart 2023-10-31 10:06:21 +0100
  • bc1058a0d5 Merge branch 'victorchall:main' into feat_add_sde_samplers #232 Damian Stewart 2023-10-23 00:04:03 +0200
  • 380b2a54ca cleanup logging #233 Damian Stewart 2023-10-22 23:00:54 +0200
  • 69158fa51b fix distribution logic Damian Stewart 2023-10-22 22:02:38 +0200
  • 20ce83d3e1 log curve Damian Stewart 2023-10-22 21:41:20 +0200
  • 9a69ce84cb typo Damian Stewart 2023-10-22 20:23:57 +0200
  • 6434844432 add missing json, fix error Damian Stewart 2023-10-22 20:09:53 +0200
  • 9396d2156e Merge remote-tracking branch 'upstream/main' into feat_the_acculmunator Damian Stewart 2023-10-22 19:35:32 +0200
  • 26a1475f0c initial implementation of the_acculmunator Damian Stewart 2023-10-22 19:26:35 +0200
  • 7a46eee3b2 new global attempt 2 Damian Stewart 2023-10-22 10:47:07 +0200
  • 85f3e05012 new global attempt Damian Stewart 2023-10-22 10:36:36 +0200
  • f30346497f fix base class Damian Stewart 2023-10-22 09:59:55 +0200
  • a5bdb499d9 fix global access Damian Stewart 2023-10-22 09:45:08 +0200
  • df074e84ec add embedding save and more performant token resetting Damian Stewart 2023-10-21 21:45:15 +0200
  • dba6cdd98e more memory-efficient model part disabling Damian Stewart 2023-10-15 15:57:13 +0200
  • 4f69e4d835 make it work Damian Stewart 2023-10-15 15:02:04 +0200
  • d6f99145e3 better handle overwriting Damian Stewart 2023-10-15 11:23:03 +0200
  • ee151e8c9b textual inversion plugin first pass Damian Stewart 2023-10-15 11:10:55 +0200
  • 2f2dd4c1f2 Merge pull request #229 from luisgabrielroldan/fix-undersize-warning Victor Hall 2023-10-07 15:25:25 -0400
  • 12d1a3fca7 Move txt-file presence check up as break condition #230 Tim Lawrenz 2023-10-07 15:13:59 -0400
  • 387616083c Merge pull request #228 from luisgabrielroldan/fix-save-ckpt-dir Victor Hall 2023-10-07 14:34:52 -0400
  • 81807b6cab Merge pull request #227 from timlawrenz/patch-1 Victor Hall 2023-10-07 14:32:16 -0400
  • 60e10867bc Fix undersize warning #229 Gabriel Roldán 2023-10-05 01:27:47 -0300
  • f301677881 Fix save_ckpt_dir not being used when saving model #228 Gabriel Roldán 2023-10-01 00:09:24 -0300
  • 8f9d23cf3b Update CAPTION.md #227 Tim Lawrenz 2023-09-27 08:17:01 -0400
  • e8e4f0c2ea Merge pull request #214 from luisgabrielroldan/keep_tags Victor Hall 2023-09-25 13:10:21 -0400
  • 85b5bcc6e4 Update Train_Colab.ipynb Victor Hall 2023-09-25 12:34:13 -0400
  • 3a2485eac5 fix save_every_n_epoch bug in colab Victor Hall 2023-09-25 12:15:29 -0400
  • 8064c29066 add safetensors to colab req Victor Hall 2023-09-25 12:01:35 -0400
  • 9e820c8cce doc for lion8bit and prodigy Victor Hall 2023-09-22 16:02:22 -0400
  • 78a4c13c4c add prodigyopt requirements Victor Hall 2023-09-22 13:46:25 -0400
  • 5444abfd7b finalize lion8bit and add prodigy Victor Hall 2023-09-22 12:15:32 -0400
  • a9c98f5866 bugged flag Victor Hall 2023-09-22 10:16:56 -0400
  • 166c2e74e1 off by one on last epoch save Victor Hall 2023-09-21 21:29:36 -0400
  • f080c9f5dc fix issues from rebase #234 Damian Stewart 2023-09-21 20:37:06 +0200
  • 9b046698d4 typo Damian Stewart 2023-08-14 15:07:31 +0200
  • 0b03f4d150 add --input_perturbation arg Damian Stewart 2023-08-13 20:11:10 +0200
  • ada6037463 add bnb lion8bit support Victor Hall 2023-09-21 13:47:26 -0400
  • a47d65799f early work on pinned image tensor Victor Hall 2023-09-21 13:44:36 -0400
  • f2d5a40f72 Add --keep_tags docs and fix some typos #214 Gabriel Roldan 2023-09-20 19:51:14 -0300
  • 99a0431d0f Ignore negative keep_tags values Gabriel Roldan 2023-09-20 19:50:34 -0300
  • 43984f2ad3 Add --keep_tags to keep first N tags fixed on shuffle Gabriel Roldán 2023-07-17 01:33:52 -0300
  • 6c8d15daab Merge pull request #209 from damian0815/feat_rolling_save Victor Hall 2023-09-20 16:32:48 -0400
  • 09aa13c3dd Merge branch 'main' into feat_rolling_save #209 Victor Hall 2023-09-20 16:32:37 -0400
  • 2dff3aa8d1 ema update Victor Hall 2023-09-18 16:13:22 -0400
  • 303c8312e3 update ema sample args again Victor Hall 2023-09-18 16:12:51 -0400
  • 29bab698a3 minor update to ema docs Victor Hall 2023-09-18 15:07:39 -0400
  • 2f52832209 fix trainSD21.json and advanced tweaking ema param names Victor Hall 2023-09-18 14:54:43 -0400
  • e4d93225f7 fix train21.json ema param names Victor Hall 2023-09-18 14:53:17 -0400
  • a68ebe3658 fix typo Damian Stewart 2023-09-17 20:17:54 +0200
  • 3579bec540 cleanup Damian Stewart 2023-09-10 21:42:29 +0200
  • 74c21602fc default every epoch not 3x per epoch Damian Stewart 2023-09-10 21:41:09 +0200
  • 3fddef3698 put back make_save_path and fix error in plugin runner Damian Stewart 2023-09-10 21:37:47 +0200
  • 3f75115a3f Update Train_Colab.ipynb - Added missing python packages #226 Charles-Philippe Bernard 2023-09-14 00:03:48 +0100
  • fa5b38e26b some minor updates to ema Victor Hall 2023-09-12 21:37:27 -0400
  • 49de395df1
    Merge pull request #224 from a-l-e-x-d-s-9/main Victor Hall 2023-09-12 15:36:27 -0400
  • 7259ce873b 1. Samples format change to make sure global step appear before "ema" indication. #224 alexds9 2023-09-11 00:13:26 +0300
  • 39b3082bf4 1. Making sure to release VRAM in samples. alexds9 2023-09-10 22:42:01 +0300
  • d2d493c911 1. New parameters added to train.json and trainSD21.json - disabled by default. 2. Description added to ADVANCED_TWEAKING.md alexds9 2023-09-10 20:06:50 +0300
  • 5b1760fff2 1. Added an argument ema_decay_resume_model to load EMA model - it's loaded alongside main model, instead of copying normal model. It's optional, without loaded EMA model, it will copy the regular model to be the first EMA model, just like before. 2. Fixed findlast option for regular models not to load EMA models by default. 3. findlast can be used to load EMA model too when used with ema_decay_resume_model. 4. Added ema_device variable to store the device in torch type. 5. Cleaned prints and comments. alexds9 2023-09-07 19:53:20 +0300
  • cf4a082e11 1. Fix to EMA samples arguments not respecting False value. alexds9 2023-09-06 23:04:12 +0300
  • 5bcf9407f0 1. Improved EMA support: samples generation with arguments EMA/NOT-EMA, saving checkpoints and diffusers for both, ema_decay_target implemented. 2. enable_zero_terminal_snr separated from zero_frequency_noise_ratio. alexds9 2023-09-06 22:37:10 +0300
  • 23df727a1f Added support for: 1. EMA decay. Using EMA decay model, it is updated every ema_decay_interval by (1 - ema_decay_rate), it can be stored on CPU to save VRAM. Only EMA model is saved now. 2. min_snr_gamma - improve converging speed, more info: https://arxiv.org/abs/2303.09556 3. load_settings_every_epoch - Will load 'train.json' at start of every epoch. alexds9 2023-09-06 13:38:52 +0300
  • a7f94e96e0 set default resume_ckpt to sd1.5 Victor Hall 2023-09-05 18:04:05 -0400
  • affa4f3579 Merge 0c7ddb3c7f into 3954182e45 #212 Shawn 2023-09-04 08:04:32 +0500
  • 3954182e45 Merge pull request #220 from damian0815/fix_ztsnr_samples_and_save Victor Hall 2023-09-03 17:57:27 -0400
  • d493d16504 fix conv Victor Hall 2023-09-03 16:03:45 -0400
  • d6ef90bafc add scipy to docker Victor Hall 2023-09-03 12:17:11 -0400
  • 1bd8fa1881 Update windows_setup.cmd to bnb 41.1 Victor Hall 2023-09-02 11:18:12 -0400
  • 2000ec178d Update requirements.txt Victor Hall 2023-08-31 12:46:46 -0400
  • a33a62b786 Update requirements-runtime.txt Victor Hall 2023-08-31 12:46:19 -0400
  • 9b5b96a50b fixes for ZTSNR training #220 Damian Stewart 2023-08-15 20:47:34 +0200
  • 4b824f3acd add samplers Damian Stewart 2023-08-10 20:47:28 +0200
  • 24b17efb08 Update optimizer_dadapt.json Victor Hall 2023-08-04 22:13:52 -0400
  • d196865f4c Update optimizer_dadapt.json Victor Hall 2023-08-04 22:13:21 -0400
  • ab563a50e0 Update OPTIMIZER.md Victor Hall 2023-07-23 02:16:26 -0400
  • a87b4ff92e Update CHAINING.md Victor Hall 2023-07-16 18:10:34 -0400
  • 28e3d22aff Update ADVANCED_TWEAKING.md Victor Hall 2023-07-16 17:57:17 -0400
  • 0c7ddb3c7f Created using Colaboratory #212 Shawn 2023-07-16 11:09:12 -0500
  • a6032bee35 Update CAPTION.md Victor Hall 2023-07-11 15:46:41 -0400