Victor Hall
c0a1955164
Merge pull request #233 from damian0815/feat_the_acculmunator
plugin: grad accumulation scheduler
2023-11-05 19:53:11 -05:00
Damian Stewart
da731268b2
put a file loss_scale.txt containing a float in a training folder to apply loss scale (e.g. -1 for negative examples)
2023-10-31 10:06:21 +01:00
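The commit above describes a per-folder override: a loss_scale.txt file containing a single float scales the loss for that folder's images. A minimal sketch of how such a file might be read (the function name and the default of 1.0 are assumptions for illustration, not the repo's actual implementation):

```python
import os

def read_loss_scale(folder: str, default: float = 1.0) -> float:
    """Return the loss scale for a training folder.

    A loss_scale.txt file holding a single float (e.g. -1 for
    negative examples) overrides the default; otherwise the
    default multiplier of 1.0 applies.
    """
    path = os.path.join(folder, "loss_scale.txt")
    if os.path.isfile(path):
        with open(path) as f:
            return float(f.read().strip())
    return default
```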
Damian Stewart
9396d2156e
Merge remote-tracking branch 'upstream/main' into feat_the_acculmunator
2023-10-22 19:35:32 +02:00
Damian Stewart
26a1475f0c
initial implementation of the_acculmunator
2023-10-22 19:26:35 +02:00
Victor Hall
e8e4f0c2ea
Merge pull request #214 from luisgabrielroldan/keep_tags
Add --keep_tags to keep first N tags fixed on shuffle
2023-09-25 13:10:21 -04:00
Victor Hall
a47d65799f
early work on pinned image tensor
2023-09-21 13:48:40 -04:00
Gabriel Roldán
43984f2ad3
Add --keep_tags to keep first N tags fixed on shuffle
2023-09-20 19:53:30 -03:00
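The --keep_tags behaviour added here pins the first N comma-separated tags in place while the rest are shuffled. A hedged sketch of that logic (a standalone helper, not the actual EveryDream2 data-pipeline code):

```python
import random

def shuffle_tags(caption: str, keep_tags: int = 0, rng=None) -> str:
    """Shuffle comma-separated tags, keeping the first keep_tags fixed.

    Illustrative only: splits on commas, holds the leading tags in
    position, and shuffles the remainder.
    """
    rng = rng or random.Random()
    tags = [t.strip() for t in caption.split(",")]
    fixed, rest = tags[:keep_tags], tags[keep_tags:]
    rng.shuffle(rest)
    return ", ".join(fixed + rest)
```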
Victor Hall
56deb26a59
early work on shuffle_tags.txt and add try around trimming
2023-06-01 16:18:21 -04:00
Victor Hall
5c98cdee70
update to torch2, xformers 20, bnb 381
2023-05-30 22:15:02 -04:00
tyler
a839180199
set # of data loaders by the min of batch size or cpu count, do not do an RGB conversion when only loading image metadata
2023-04-14 14:59:28 -05:00
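The worker-count rule in this commit is a one-liner: never spin up more loader processes than either the batch size or the machine's CPUs can use. A sketch under that assumption (helper name is hypothetical):

```python
import os

def num_loader_workers(batch_size: int) -> int:
    """Number of data-loader workers: the smaller of batch size and CPU count."""
    return min(batch_size, os.cpu_count() or 1)
```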
Augusto de la Torre
7e20a74586
Fix cond_dropout and rating handling
2023-03-13 00:36:59 +01:00
Augusto de la Torre
0716c40ab6
Add support for enhanced dataset configuration
Add support for:
* flip_p.txt
* cond_dropout.txt
* local.yaml consolidated config (including default captions)
* global.yaml consolidated config which applies recursively to subfolders
* flip_p, and cond_dropout config per image
* manifest.json with full image-level configuration
2023-03-08 15:02:14 +01:00
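The configuration scheme listed above is hierarchical: global.yaml values apply recursively to subfolders, while a folder's own local.yaml (or flip_p.txt / cond_dropout.txt) takes precedence. A minimal sketch of that merge rule (function and key names are illustrative assumptions):

```python
def resolve_folder_config(inherited: dict, local_overrides: dict) -> dict:
    """Merge one folder's config over values inherited from parent folders.

    Hypothetical sketch: parent global.yaml values flow down, and the
    folder's local overrides win on conflict.
    """
    merged = dict(inherited)
    merged.update(local_overrides)
    return merged
```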
Victor Hall
8abef6bc74
revert multiline txt for now due to bug
2023-02-28 21:14:19 -05:00
Victor Hall
6c87af7711
doc, fix lr defaulting bug, notdreambooth.md, read multicaption from .txt by line
2023-02-26 19:11:42 -05:00
Victor Hall
a9b0189947
fix mem leak on huge data, rework optimizer to separate json, add lion optimizer
2023-02-25 15:05:22 -05:00
Damian Stewart
4e37200dda
fix multiplier issues with validation and refactor validation logic
2023-02-08 11:28:45 +01:00
damian
dad9e347ff
log ED batch name on creation
2023-02-07 18:08:19 +01:00
damian
29396ec21b
update EveryDreamValidator for noprompt's changes
2023-02-07 17:46:30 +01:00
Joel Holdbrooks
41c9f36ed7
GH-36: Add support for validation split (WIP)
Co-authored-by: Damian Stewart <office@damianstewart.com>
2023-02-06 22:10:34 -08:00
Joel Holdbrooks
56f130c027
Forgot to add train.py earlier 🤦; move write_batch_schedule to train.py
2023-01-29 18:11:34 -08:00
Joel Holdbrooks
12a0cb6286
Update documentation
2023-01-29 17:58:42 -08:00
Joel Holdbrooks
c0ec46c030
Don't need to set data loader singleton; formatting tweaks
2023-01-29 17:31:57 -08:00
Joel Holdbrooks
326d861a86
Push DLMA into main, pass config to resolve
This patch
* passes the configuration (`argparse.Namespace`) to the resolver,
* pushes the DLMA code into the main function,
* makes DLMA take a `list[ImageTrainItem]` instead of `data_root`,
* makes `EveryDreamBatch` take `DLMA` instead of `data_root`, etc.
* allows `data_root` to be a list.
By doing these things, both `EveryDreamBatch` and DLMA can be free from
data resolution logic. It also reduces the number of arguments which
need to be passed down to EDB and DLMA.
2023-01-29 17:08:54 -08:00
Victor Hall
18d1da0459
bug fix and multiply.txt fraction stuff
2023-01-22 18:59:59 -05:00
Jan Gerritsen
711e263e24
Implemented system to train on a subset of the dataset, favouring higher rated images
2023-01-14 16:18:22 +01:00
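Training on a rating-weighted subset, as this commit describes, amounts to sampling a fraction of the dataset without replacement while biasing toward higher ratings. One standard way to do that is Efraimidis–Spirakis weighted reservoir keys; this sketch uses that technique as an assumption, not as the repo's actual method:

```python
import random

def rating_weighted_subset(items, ratings, fraction, rng=None):
    """Select a fraction of the dataset, favouring higher-rated images.

    Efraimidis–Spirakis sampling without replacement: each item gets
    key u ** (1 / w) for uniform u and positive weight w; the k
    largest keys are kept, so larger weights win more often.
    """
    rng = rng or random.Random()
    k = max(1, round(len(items) * fraction))
    keyed = sorted(
        zip(items, ratings),
        key=lambda pair: rng.random() ** (1.0 / pair[1]),
        reverse=True,
    )
    return [item for item, _ in keyed[:k]]
```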
Jan Gerritsen
3d2709ace9
Implemented loading captions from yaml file
2023-01-09 21:53:07 +01:00
Jan Gerritsen
a3618409bc
Support more control regarding caption tag shuffling using yaml files
2023-01-09 21:53:07 +01:00
Victor Hall
98f9a7302d
shuffle tags arg
2023-01-06 19:12:52 -05:00
Victor Hall
65c5fd5ccb
json driven args, update run batch compensation
2023-01-03 14:27:26 -05:00
Victor Hall
07125135a8
safer writing of batch schedule
2023-01-02 18:02:35 -05:00
Victor Hall
b316684bdb
bunch of updates, grad ckpting, no drop bucket, shuffle every epoch
2023-01-01 10:45:18 -05:00
Victor Hall
4c53f2d55c
various tweaks and bugfixes over holidays
2022-12-27 14:25:32 -05:00
Victor Hall
8904724135
fix some quality issues
2022-12-20 03:30:42 -05:00
Victor Hall
ac6391ef87
save every n epochs, update docs
2022-12-18 21:16:14 -05:00
Victor Hall
179fd5395b
hey look ed2
2022-12-17 22:32:48 -05:00