Commit Graph

98 Commits

Author SHA1 Message Date
Victor Hall 3697b16344 put exif transpose back in preloading to fix bug 2023-04-16 02:19:27 -04:00
tyler b8f637873f removing the convert_rgb flag for a simpler design 2023-04-14 20:21:48 -05:00
tyler a839180199 set # of data loaders by the min of batch size or cpu count, do not do an RGB conversion when only loading image metadata 2023-04-14 14:59:28 -05:00
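The worker-sizing rule named in this commit is simply the smaller of the batch size and the CPU count. A minimal sketch of that rule, assuming a PyTorch DataLoader; `make_loader` and its signature are illustrative, not the trainer's actual code:

```python
import os

from torch.utils.data import DataLoader

def make_loader(dataset, batch_size: int) -> DataLoader:
    # Cap the worker count at the batch size so we never spawn more loader
    # processes than there are samples to fetch per step.
    num_workers = min(batch_size, os.cpu_count() or 1)
    return DataLoader(dataset, batch_size=batch_size, num_workers=num_workers)
```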
Augusto de la Torre 2bb35eaa0a Use filenames for caption if no main prompt in yaml 2023-04-14 00:59:26 +02:00
Augusto de la Torre dd98ebe080 Add error handling for bad exif, and rotate before sizing 2023-03-28 19:19:08 +02:00
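The pattern behind this commit (and the exif-transpose commits above) is to apply the EXIF rotation before any sizing decisions and to tolerate broken EXIF blocks. A minimal Pillow sketch of that idea; `load_oriented` is illustrative, not the repository's function:

```python
from PIL import Image, ImageOps

def load_oriented(path: str) -> Image.Image:
    image = Image.open(path)
    try:
        # Apply the EXIF orientation tag before any size/aspect decisions,
        # so later resizing and bucketing see the image as it will be trained.
        image = ImageOps.exif_transpose(image)
    except Exception:
        # Corrupt or malformed EXIF data: fall back to the un-rotated image.
        pass
    return image
```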
Victor Hall 56256ab9ef attempt to catch some errors reported on github 2023-03-26 11:52:49 -04:00
Victor Hall 35d52b56e0 added some resolutions, option for val-loss pos-neg, fix wandb 2023-03-25 20:09:06 -04:00
Augusto de la Torre cdafa2dc43 Avoid unnecessary iteration over files in folder 2023-03-21 12:09:52 +01:00
Augusto de la Torre 161e0a563c Prioritize tags, `image > local > global`, but respect weights 2023-03-21 00:15:53 +01:00
Augusto de la Torre fae0b3c535 Retain original tag order when parsing captions 2023-03-19 23:30:42 +01:00
Victor Hall 5afd75fd98 try-except around preloading items to help troubleshoot issues 2023-03-18 22:27:09 -04:00
Victor Hall f01a7354f0 remove sort from dataset due to slowdown on large sets, add contribution readme 2023-03-18 22:24:03 -04:00
Augusto de la Torre 48f132554c Assign default value for MAX_CAPTION_LENGTH 2023-03-15 21:52:32 +01:00
Victor Hall ba687de8b4 add pbar back to preloading, remove cruft from testing loss stuff 2023-03-15 12:06:29 -04:00
Victor Hall 605716a646 conf 2023-03-15 11:22:40 -04:00
Victor Hall da3c183cc5 autofix exif orientation 2023-03-15 11:12:59 -04:00
Victor Hall 493afc3f20
Merge pull request #106 from qslug/enhanced-config
Add support for enhanced dataset configuration
2023-03-14 01:28:54 -04:00
Augusto de la Torre 7e20a74586 Fix cond_dropout and rating handling 2023-03-13 00:36:59 +01:00
Augusto de la Torre dea8c6e862 Refactor dataset to support multiple main_prompts
...rather than multiple whole captions
2023-03-12 19:48:15 +01:00
Damian Stewart f0111a6e2b make validation more comparable across runs 2023-03-09 08:21:59 +01:00
Augusto de la Torre 0716c40ab6 Add support for enhanced dataset configuration
Add support for:
* flip_p.txt
* cond_dropout.txt
* local.yaml consolidated config (including default captions)
* global.yaml consolidated config which applies recursively to subfolders
* flip_p, and cond_dropout config per image
* manifest.json with full image-level configuration
2023-03-08 15:02:14 +01:00
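The commit above describes a layered configuration: a `global.yaml` that applies recursively to subfolders, a per-folder `local.yaml`, and per-image settings such as `flip_p` and `cond_dropout`. A minimal sketch of that precedence, assuming dict-shaped configs where the most specific scope wins; the key names and the `merge_config` helper are illustrative only, not taken from the repository:

```python
def merge_config(global_cfg: dict, local_cfg: dict, image_cfg: dict) -> dict:
    """More specific scopes override less specific ones: global < local < image."""
    merged = dict(global_cfg)   # global.yaml applies recursively to subfolders
    merged.update(local_cfg)    # local.yaml consolidates per-folder settings
    merged.update(image_cfg)    # per-image overrides (flip_p, cond_dropout, ...)
    return merged

# Example: a per-image cond_dropout overrides the folder default.
effective = merge_config(
    {"flip_p": 0.0, "cond_dropout": 0.04},
    {"flip_p": 0.5},
    {"cond_dropout": 0.10},
)
assert effective == {"flip_p": 0.5, "cond_dropout": 0.10}
```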
Victor Hall 2e3d044ba3 Merge branch 'main' of https://github.com/victorchall/EveryDream2trainer 2023-03-04 15:09:31 -05:00
Victor Hall 87ffb5413e log warning when val/loss is showing upward trend 2023-03-04 15:08:47 -05:00
Damian Stewart e2fd45737d overwrite args.seed with the actual seed if -1 is passed (so it appears in tensorboard)
also improve logging when unet training is disabled
2023-03-02 22:52:33 +01:00
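The seed handling described in this commit follows a common pattern: when -1 is passed, draw a real seed and write it back onto the parsed args so the logged value (e.g. in TensorBoard) is the one actually used. A minimal sketch, assuming an argparse-style `args` object; not the trainer's actual code:

```python
import argparse
import random

def resolve_seed(args: argparse.Namespace) -> int:
    # -1 is a "pick one for me" sentinel; overwrite it with the chosen value
    # so downstream logging reports the real seed rather than -1.
    if args.seed == -1:
        args.seed = random.randint(0, 2**30)
    return args.seed

args = argparse.Namespace(seed=-1)
print(resolve_seed(args), args.seed)  # both show the same concrete seed
```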
damian 61558be2ae logging and progress bar improvements 2023-03-02 18:29:28 +01:00
Victor Hall 8abef6bc74 revert multiline txt for now due to bug 2023-02-28 21:14:19 -05:00
Victor Hall 6c87af7711 doc, fix lr defaulting bug, notdreambooth.md, read multicaption from .txt by line 2023-02-26 19:11:42 -05:00
Victor Hall e7fc71ffa1
Merge pull request #92 from victorchall/lion
fix mem leak on huge data, rework optimizer to separate json, add lio…
2023-02-25 16:20:10 -05:00
Victor Hall a9b0189947 fix mem leak on huge data, rework optimizer to separate json, add lion optimizer 2023-02-25 15:05:22 -05:00
Damian Stewart be8cdf8af8 actually use validation seed 😭 also fix log of sample negative prompt 2023-02-20 21:58:46 +01:00
Damian Stewart 7956e24b61 fix division by zero 2023-02-19 10:25:38 +01:00
Victor Hall 8a8a4cf3df make val optional, revert multiply algo 2023-02-08 13:04:12 -05:00
Damian Stewart 19347bcaa8 make fractional multiplier logic apply per-directory 2023-02-08 14:15:54 +01:00
Damian Stewart a7b00e9ef3 fix multiplier logic 2023-02-08 13:46:58 +01:00
Damian Stewart 4e37200dda fix multiplier issues with validation and refactor validation logic 2023-02-08 11:28:45 +01:00
damian bca1e6e594 consistent spelling 2023-02-07 18:21:05 +01:00
damian e2d9600e34 cleaner config handling 2023-02-07 18:18:21 +01:00
damian dad9e347ff log ed batch name on creation 2023-02-07 18:08:19 +01:00
damian f0d7310c12 clarify init function names 2023-02-07 17:54:00 +01:00
damian c3d844a1bc better config handling 2023-02-07 17:52:23 +01:00
damian 29396ec21b update EveryDreamValidator for noprompt's changes 2023-02-07 17:46:30 +01:00
Joel Holdbrooks 41c9f36ed7 GH-36: Add support for validation split (WIP)
Co-authored-by: Damian Stewart <office@damianstewart.com>
2023-02-06 22:10:34 -08:00
Victor Hall 85f19b9a2f doc and bug with undersized 2023-02-06 13:11:24 -05:00
Joel Holdbrooks 56f130c027 Forgot to add train.py earlier 🤦; move write_batch_schedule to train.py 2023-01-29 18:11:34 -08:00
Joel Holdbrooks 12a0cb6286 Update documentation 2023-01-29 17:58:42 -08:00
Joel Holdbrooks 3fe335f328 Update documentation 2023-01-29 17:47:10 -08:00
Joel Holdbrooks c0ec46c030 Don't need to set data loader singleton; formatting tweaks 2023-01-29 17:31:57 -08:00
Joel Holdbrooks 326d861a86 Push DLMA into main, pass config to resolve
This patch

* passes the configuration (`argparse.Namespace`) to the resolver,
* pushes the DLMA code into the main function,
* makes DLMA take a `list[ImageTrainItem]` instead of `data_root`,
* makes `EveryDreamBatch` take `DLMA` instead of `data_root`, etc.
* allows `data_root` to be a list.

By doing these things, both `EveryDreamBatch` and DLMA can be free from
data resolution logic. It also reduces the number of arguments which
need to be passed down to EDB and DLMA.
2023-01-29 17:08:54 -08:00
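The patch above separates data resolution from batching: a resolver turns the parsed config into a list of `ImageTrainItem`s, DLMA buckets that list, and `EveryDreamBatch` is built from the DLMA. The sketch below shows only that wiring; the classes are simplified stand-ins with guessed signatures, not the repository's real implementations:

```python
import argparse
from dataclasses import dataclass

# Simplified stand-ins; only the wiring between them is the point here.
@dataclass
class ImageTrainItem:
    path: str
    caption: str

class DataLoaderMultiAspect:                      # "DLMA" in the commit message
    def __init__(self, items: list[ImageTrainItem], batch_size: int):
        self.items, self.batch_size = items, batch_size

class EveryDreamBatch:
    def __init__(self, dlma: DataLoaderMultiAspect):
        self.dlma = dlma

def resolve(args: argparse.Namespace) -> list[ImageTrainItem]:
    # The resolver, not DLMA, walks data_root, which may be a path or a list.
    roots = args.data_root if isinstance(args.data_root, list) else [args.data_root]
    return [ImageTrainItem(path=root, caption="") for root in roots]

def main(args: argparse.Namespace) -> EveryDreamBatch:
    items = resolve(args)                                 # config goes to the resolver
    dlma = DataLoaderMultiAspect(items, args.batch_size)  # DLMA takes items, not data_root
    return EveryDreamBatch(dlma)                          # EDB takes DLMA, not data_root
```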
Victor Hall 9639237762 minor fix to multiply.txt stuff and undersized images txt output new line 2023-01-27 13:58:14 -05:00
Joel Holdbrooks 94e9abf184 Don't bind resolver 2023-01-24 08:36:40 -08:00