Yuta Hayashibe
4b9f58952a
Add --pretrained_model_name_revision option to train_dreambooth.py ( #933 )
* Add --pretrained_model_name_revision option to train_dreambooth.py
* Renamed --pretrained_model_name_revision to --revision
2022-10-25 21:38:23 +02:00
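A minimal sketch of how a --revision flag like this is typically wired through to from_pretrained (the flag name comes from the commit; the default value and the exact model classes loaded are assumptions):

```python
import argparse

from diffusers import AutoencoderKL, UNet2DConditionModel

parser = argparse.ArgumentParser()
parser.add_argument("--pretrained_model_name_or_path", type=str, required=True)
# Revision of the pretrained weights on the Hub: a branch name, tag, or commit id.
parser.add_argument("--revision", type=str, default=None)
args = parser.parse_args()

# Thread the requested revision through every from_pretrained call.
unet = UNet2DConditionModel.from_pretrained(
    args.pretrained_model_name_or_path, subfolder="unet", revision=args.revision
)
vae = AutoencoderKL.from_pretrained(
    args.pretrained_model_name_or_path, subfolder="vae", revision=args.revision
)
```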
Hanusz Leszek
4bf675f465
Dreambooth class image generation: using unique names to avoid overwriting existing image ( #847 )
* Add an underscore to filename if it already exists
* Use sha1sum hash instead of adding underscores
2022-10-20 15:56:15 +02:00
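A rough illustration of the hash-based naming described in the second bullet; the exact filename pattern used by train_dreambooth.py may differ:

```python
import hashlib
from pathlib import Path

def save_class_image(image, index: int, class_images_dir: Path) -> Path:
    """Save a generated class image under a content-derived name so that
    rerunning generation never overwrites an existing file."""
    # sha1 of the raw pixel data yields a stable, collision-resistant suffix.
    image_hash = hashlib.sha1(image.tobytes()).hexdigest()
    image_path = class_images_dir / f"{index}-{image_hash}.jpg"
    image.save(image_path)
    return image_path
```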
Suraj Patil
7674a36a34
[dreambooth] don't use safety check when generating prior images ( #922 )
don't use safety check when generating prior images
2022-10-20 13:52:11 +02:00
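Disabling the safety checker for class-image generation roughly looks like this (safety_checker=None is standard diffusers API; the model id and prompt are placeholders):

```python
import torch
from diffusers import StableDiffusionPipeline

# Class (prior) images come from the model's own prior, so the safety
# checker is not loaded at all; safety_checker=None disables it.
pipeline = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # placeholder model id
    safety_checker=None,
    torch_dtype=torch.float16,
)
pipeline.to("cuda")
class_images = pipeline("a photo of a dog", num_images_per_prompt=4).images
```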
Suraj Patil
fbe807bf57
[dreambooth] allow fine-tuning text encoder ( #883 )
* allow fine-tuning text encoder
* fix a few things
* update readme
2022-10-18 17:28:51 +02:00
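A sketch of what enabling text-encoder fine-tuning usually amounts to: gate it behind a flag (here --train_text_encoder, based on the PR) and hand both models' parameters to a single optimizer. Model id and learning rate are placeholders:

```python
import itertools

import torch
from diffusers import UNet2DConditionModel
from transformers import CLIPTextModel

model_path = "runwayml/stable-diffusion-v1-5"  # placeholder
train_text_encoder = True  # would come from a --train_text_encoder flag

unet = UNet2DConditionModel.from_pretrained(model_path, subfolder="unet")
text_encoder = CLIPTextModel.from_pretrained(model_path, subfolder="text_encoder")

# Optimize the UNet alone, or the UNet and the text encoder jointly.
params_to_optimize = (
    itertools.chain(unet.parameters(), text_encoder.parameters())
    if train_text_encoder
    else unet.parameters()
)
optimizer = torch.optim.AdamW(params_to_optimize, lr=5e-6)
```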
Anton Lozhkov
e001fededf
Fix dreambooth loss type with prior_preservation and fp16 ( #826 )
Fix dreambooth loss type with prior preservation
2022-10-13 15:41:19 +02:00
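Roughly what the loss computation with prior preservation looks like; casting to float32 before the MSE reduction avoids fp16 issues in the mean (the exact change in the PR may differ):

```python
import torch
import torch.nn.functional as F

def dreambooth_loss(noise_pred, target, prior_loss_weight=1.0, with_prior_preservation=True):
    if with_prior_preservation:
        # The batch holds instance examples followed by class examples,
        # so split both the prediction and the target in two.
        noise_pred, noise_pred_prior = torch.chunk(noise_pred, 2, dim=0)
        target, target_prior = torch.chunk(target, 2, dim=0)

        # Compute both terms in float32 so the reduction is not done in fp16.
        loss = F.mse_loss(noise_pred.float(), target.float(), reduction="mean")
        prior_loss = F.mse_loss(noise_pred_prior.float(), target_prior.float(), reduction="mean")
        return loss + prior_loss_weight * prior_loss

    return F.mse_loss(noise_pred.float(), target.float(), reduction="mean")
```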
spezialspezial
e895952816
Eventually preserve this typo? :) ( #804 )
2022-10-11 20:06:24 +02:00
Henrik Forstén
81bdbb5e2a
DreamBooth DeepSpeed support for under 8 GB VRAM training ( #735 )
* Support deepspeed
* Dreambooth DeepSpeed documentation
* Remove unnecessary casts, documentation
Due to recent commits, some casts to half precision are no longer necessary.
Mention that DeepSpeed's version of Adam is about 2x faster.
* Review comments
2022-10-10 21:29:27 +02:00
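Under this setup the optimizer might be selected roughly as below; DeepSpeedCPUAdam is the "about 2x faster" fused Adam mentioned above, used when DeepSpeed offloads optimizer state to the CPU (the selection logic here is an assumption, not the script's exact code):

```python
import torch

def build_optimizer(params, lr, use_deepspeed_adam=False, use_8bit_adam=False):
    if use_deepspeed_adam:
        # DeepSpeed's fused CPU Adam, roughly 2x faster when state is offloaded.
        from deepspeed.ops.adam import DeepSpeedCPUAdam
        return DeepSpeedCPUAdam(params, lr=lr)
    if use_8bit_adam:
        # bitsandbytes 8-bit Adam keeps optimizer state quantized to save VRAM.
        import bitsandbytes as bnb
        return bnb.optim.AdamW8bit(params, lr=lr)
    return torch.optim.AdamW(params, lr=lr)
```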
YaYaB
906e4105d7
Fix push_to_hub for dreambooth and textual_inversion ( #748 )
* Fix push_to_hub for dreambooth and textual_inversion
* Use repo.push_to_hub instead of push_to_hub
2022-10-07 11:50:28 +02:00
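A hedged sketch of the Repository-based flow the fix switches to (repo names, paths, and the commit message are placeholders):

```python
from huggingface_hub import Repository

# Clone (or reuse) the target Hub repo inside the training output directory.
repo = Repository("dreambooth-output", clone_from="user/dreambooth-model")

# ... training writes the final pipeline into dreambooth-output ...

# Push through the Repository object rather than a bare push_to_hub call.
repo.push_to_hub(commit_message="End of training", blocking=False)
```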
Patrick von Platen
4deb16e830
[Docs] Advertise fp16 instead of autocast ( #740 )
up
2022-10-05 22:20:53 +02:00
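In code, the recommendation boils down to letting accelerate handle half precision instead of wrapping the training step in torch.autocast; a minimal sketch:

```python
from accelerate import Accelerator

# Mixed precision is configured once here (or via `accelerate config` /
# `accelerate launch --mixed_precision fp16`) instead of a manual autocast block.
accelerator = Accelerator(mixed_precision="fp16")
```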
Suraj Patil
19e559d5e9
remove use_auth_token from remaining places ( #737 )
remove use_auth_token
2022-10-05 17:40:49 +02:00
Pierre LeMoine
08d4fb6e9f
[dreambooth] Using already created `Path` in dataset ( #681 )
using already created `Path` in dataset
2022-10-05 12:14:30 +02:00
Yuta Hayashibe
7e92c5bc73
Fix typos ( #718 )
* Fix typos
* Update examples/dreambooth/train_dreambooth.py
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
2022-10-04 15:22:14 +02:00
Suraj Patil
14f4af8f5b
[dreambooth] fix applying clip_grad_norm_ ( #686 )
fix applying clip grad norm
2022-10-03 10:54:01 +02:00
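The fix, roughly: clip through the accelerator and only once gradients are actually synchronized; a sketch with assumed variable names:

```python
from accelerate import Accelerator

def training_step(accelerator: Accelerator, unet, optimizer, loss, max_grad_norm: float = 1.0):
    accelerator.backward(loss)
    # Clip only when the (possibly accumulated) gradients are in sync;
    # accelerator.clip_grad_norm_ also unscales fp16 gradients before clipping.
    if accelerator.sync_gradients:
        accelerator.clip_grad_norm_(unet.parameters(), max_grad_norm)
    optimizer.step()
    optimizer.zero_grad()
```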
Suraj Patil
ac665b6484
[examples/dreambooth] don't pass tensor_format to scheduler. ( #649 )
don't pass tensor_format
2022-09-27 17:24:12 +02:00
Zhenhuan Liu
3b747de845
Add training example for DreamBooth. ( #554 )
* Add training example for DreamBooth.
* Fix bugs.
* Update readme and default hyperparameters.
* Reformatting code with black.
* Update for multi-GPU training.
* Apply suggestions from code review
* improve sampling
* fix autocast
* improve sampling more
* fix saving
* actually fix saving
* fix saving
* improve dataset
* fix collate fn
* fix collate_fn
* fix collate fn
* fix key name
* fix dataset
* fix collate fn
* concat batch in collate fn
* add grad ckpt
* add option for 8bit adam
* do two forward passes for prior preservation
* Revert "do two forward passes for prior preservation"
This reverts commit 661ca4677e6dccc4ad596c2ee6ca4baad4159e95.
* add option for prior_loss_weight
* add option for clip grad norm
* add more comments
* update readme
* update readme
* Apply suggestions from code review
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
* add docstr for dataset
* update the saving logic
* Update examples/dreambooth/README.md
* remove unused imports
Co-authored-by: Suraj Patil <surajp815@gmail.com>
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
2022-09-27 15:01:18 +02:00
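Several bullets above ("concat batch in collate fn", the prior-preservation work) share one idea: instance and class examples travel through the same batch, and the loss is split afterwards. A simplified sketch of such a collate function (field names are assumptions):

```python
import torch

def collate_fn(examples, with_prior_preservation=True):
    input_ids = [example["instance_prompt_ids"] for example in examples]
    pixel_values = [example["instance_images"] for example in examples]

    # With prior preservation, append the class examples to the same batch so a
    # single forward pass covers both halves; the loss is split with torch.chunk.
    if with_prior_preservation:
        input_ids += [example["class_prompt_ids"] for example in examples]
        pixel_values += [example["class_images"] for example in examples]

    pixel_values = torch.stack(pixel_values).to(memory_format=torch.contiguous_format).float()
    input_ids = torch.cat(input_ids, dim=0)
    return {"input_ids": input_ids, "pixel_values": pixel_values}
```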