Commit Graph

1638 Commits

Author SHA1 Message Date
Patrick von Platen ce1c27adc8
[Revision] Don't recommend using revision (#1764) 2022-12-19 16:25:41 +01:00
Patrick von Platen b267d28566
[Versatile] fix attention mask (#1763) 2022-12-19 15:58:39 +01:00
Anton Lozhkov c7b4acfb37
Add CPU offloading to UnCLIP (#1761)
* Add CPU offloading to UnCLIP

* use fp32 for testing the offload
2022-12-19 14:44:08 +01:00
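
For context, a minimal sketch (not code from this PR) of how sequential CPU offloading is typically enabled on a diffusers pipeline; the method name and the public "kakaobrain/karlo-v1-alpha" checkpoint are assumptions based on the library's usual API.

```python
from diffusers import UnCLIPPipeline

# Loaded in fp32 here (the PR notes that the offload is tested in fp32).
pipe = UnCLIPPipeline.from_pretrained("kakaobrain/karlo-v1-alpha")

# Keep submodules on the CPU and move each one to the GPU only while it runs,
# trading some speed for a much smaller peak VRAM footprint.
pipe.enable_sequential_cpu_offload()

image = pipe("a photo of an astronaut riding a horse").images[0]
```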
Suraj Patil be38b2d711
[UnCLIPPipeline] fix num_images_per_prompt (#1762)
duplicate mask for num_images_per_prompt
2022-12-19 14:32:46 +01:00
Anton Lozhkov 32a5d70c42
Support attn2==None for xformers (#1759) 2022-12-19 12:43:30 +01:00
Patrick von Platen 429e5449c1
Add attention mask to unCLIP (#1756)
* Remove bogus file

* [Unclip] Add efficient attention

* [Unclip] Add efficient attention
2022-12-19 12:10:46 +01:00
Anton Lozhkov dc7cd893fd
Add resnet_time_scale_shift to VD layers (#1757) 2022-12-19 12:01:46 +01:00
Mikołaj Siedlarek 8890758823
Correct help text for scheduler_type flag in scripts. (#1749) 2022-12-19 11:27:23 +01:00
Will Berman b25843e799
unCLIP docs (#1754)
* [unCLIP docs] markdown

* [unCLIP docs] UnCLIPPipeline
2022-12-19 10:27:32 +01:00
Will Berman 830a9d1f01
[fix] pipeline_unclip generator (#1751)
* [fix] pipeline_unclip generator

pass generator to all schedulers

* fix fast tests test data
2022-12-19 10:27:18 +01:00
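
Illustrative usage of the fix, as a sketch rather than code from the PR: a seeded `torch.Generator` passed to the pipeline call is now forwarded to every scheduler step, so the prior, decoder, and super-resolution stages all become deterministic.

```python
import torch
from diffusers import UnCLIPPipeline

pipe = UnCLIPPipeline.from_pretrained("kakaobrain/karlo-v1-alpha", torch_dtype=torch.float16).to("cuda")

# The same seed now yields the same image: the generator reaches every
# scheduler step, not just the initial noise tensor.
generator = torch.Generator("cuda").manual_seed(0)
image = pipe("a corgi wearing sunglasses", generator=generator).images[0]
```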
Will Berman 2dcf64b72a
kakaobrain unCLIP (#1428)
* [wip] attention block updates

* [wip] unCLIP unet decoder and super res

* [wip] unCLIP prior transformer

* [wip] scheduler changes

* [wip] text proj utility class

* [wip] UnCLIPPipeline

* [wip] kakaobrain unCLIP convert script

* [unCLIP pipeline] fixes re: @patrickvonplaten

remove callbacks

move denoising loops into call function

* UNCLIPScheduler re: @patrickvonplaten

Revert changes to DDPMScheduler. Make UNCLIPScheduler a modified
DDPM scheduler with changes to support Karlo

* mask -> attention_mask re: @patrickvonplaten

* [DDPMScheduler] remove leftover change

* [docs] PriorTransformer

* [docs] UNet2DConditionModel and UNet2DModel

* [nit] UNCLIPScheduler -> UnCLIPScheduler

matches existing unclip naming better

* [docs] SchedulingUnCLIP

* [docs] UnCLIPTextProjModel

* refactor

* finish licenses

* rename all to attention_mask and prep in models

* more renaming

* don't expose unused configs

* final renaming fixes

* remove x attn mask when not necessary

* configure kakao script to use new class embedding config

* fix copies

* [tests] UnCLIPScheduler

* finish x attn

* finish

* remove more

* rename condition blocks

* clean more

* Apply suggestions from code review

* up

* fix

* [tests] UnCLIPPipelineFastTests

* remove unused imports

* [tests] UnCLIPPipelineIntegrationTests

* correct

* make style

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
2022-12-18 15:15:30 -08:00
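
Basic usage of the new pipeline, sketched against the public Karlo checkpoint; the exact defaults (guidance scales, step counts) are left to the pipeline and are not spelled out in this log.

```python
import torch
from diffusers import UnCLIPPipeline

# unCLIP runs a text-conditioned prior, a decoder, and super-resolution
# stages end to end; Karlo is kakaobrain's public unCLIP checkpoint.
pipe = UnCLIPPipeline.from_pretrained("kakaobrain/karlo-v1-alpha", torch_dtype=torch.float16)
pipe = pipe.to("cuda")

images = pipe("a high-resolution photo of a red panda", num_images_per_prompt=2).images
images[0].save("red_panda.png")
```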
Patrick von Platen 402b9560b2
Remove license accept ticks 2022-12-19 00:10:17 +01:00
Anton Lozhkov c2a38ef9df
Fix/update the LDM pipeline and tests (#1743)
* Fix/update LDM tests

* batched generators
2022-12-18 11:49:53 +01:00
Anton Lozhkov 08cc36ddff
Fix MPS fast test warnings (#1744)
* unset level
2022-12-17 22:57:30 +01:00
Peter 723e8f6bb4
Fix ONNX img2img preprocessing (#1736)
Co-authored-by: Peter <peterto@users.noreply.github.com>
2022-12-17 13:12:10 +01:00
Patrick von Platen c53a850604
[Batched Generators] This PR adds generators that are useful to make batched generation fully reproducible (#1718)
* [Batched Generators] all batched generators

* up

* up

* up

* up

* up

* up

* up

* up

* up

* up

* up

* up

* up

* up

* up

* up

* hey

* up again

* fix tests

* Apply suggestions from code review

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>

* correct tests

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
2022-12-17 11:13:16 +01:00
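
What the batched-generators PR enables, in sketch form: a list of generators, one per image in the batch, so each image is reproducible on its own regardless of which other prompts shared the batch.

```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5").to("cuda")

# One generator per batch element: image i depends only on seed i,
# not on the rest of the batch.
generators = [torch.Generator("cuda").manual_seed(seed) for seed in (0, 1, 2, 3)]
images = pipe(["a photo of a cat"] * 4, generator=generators).images
```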
Anton Lozhkov 086c7f9ea8
Nightly integration tests (#1664)
* [WIP] Nightly integration tests

* initial SD tests

* update SD slow tests

* style

* repaint

* ImageVariations

* style

* finish imgvar

* img2img tests

* debug

* inpaint 1.5

* inpaint legacy

* torch isn't happy about deterministic ops

* allclose -> max diff for shorter logs

* add SD2

* debug

* Update tests/pipelines/stable_diffusion_2/test_stable_diffusion.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* Update tests/pipelines/stable_diffusion/test_stable_diffusion.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* fix refs

* Update src/diffusers/utils/testing_utils.py

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>

* fix refs

* remove debug

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
2022-12-16 18:51:11 +01:00
Pedro Cuenca acd317810b
Docs: recommend xformers (#1724)
* Fix links to flash attention.

* Add xformers installation instructions.

* Make link to xformers install more prominent.

* Link to xformers install from training docs.
2022-12-16 15:49:01 +01:00
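
The docs now point to the memory-efficient attention toggle; a short sketch of the call they recommend (requires `xformers` to be installed separately).

```python
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5").to("cuda")

# Swap the default attention for xFormers' memory-efficient kernels,
# lowering VRAM usage and typically speeding up inference on CUDA.
pipe.enable_xformers_memory_efficient_attention()
```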
Patrick von Platen c6d0dff4a3
Fix ldm tests on master by not running the CPU tests on GPU (#1729) 2022-12-16 15:28:40 +01:00
Anton Lozhkov a40095dd22
Fix ONNX img2img preprocessing and add fast tests coverage (#1727)
* Fix ONNX img2img preprocessing and add fast tests coverage

* revert

* disable progressbars
2022-12-16 15:24:16 +01:00
Partho 727434c206
Accept latents as optional input in Latent Diffusion pipeline (#1723)
* Latent Diffusion pipeline accepts latents

* make style

* check for mps

randn does not work reproducibly on mps
2022-12-16 12:13:41 +01:00
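
A sketch of the new argument: pre-sample the initial latents on the CPU with a fixed seed and hand them to the pipeline, which is also the workaround for `randn` not being reproducible on mps. Deriving the latent shape from the UNet config (rather than hard-coding it) is an assumption here.

```python
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained("CompVis/ldm-text2im-large-256")

# Draw the starting noise on the CPU with a fixed seed and pass it in,
# instead of letting the pipeline sample it on the target device.
shape = (1, pipe.unet.config.in_channels,
         pipe.unet.config.sample_size, pipe.unet.config.sample_size)
latents = torch.randn(shape, generator=torch.Generator("cpu").manual_seed(0))

image = pipe("a painting of a squirrel eating a burger", latents=latents).images[0]
```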
YiYi Xu 21e61eb3a9
Added a README page for docs and a "schedulers" page (#1710)
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
2022-12-15 13:04:40 -10:00
Haihao Shen c891330f79
Add examples with Intel optimizations (#1579)
* Add examples with Intel optimizations (BF16 fine-tuning and inference)

* Remove unused package

* Add README for intel_opts and refine the description for research projects

* Add notes of intel opts for diffusers
2022-12-15 21:16:27 +01:00
jiqing-feng c5f04d4e34
apply amp bf16 on textual inversion (#1465)
* add conf.yaml

* enable bf16

enable amp bf16 for unet forward

fix style

fix readme

remove useless file

* change amp to full bf16

* align

* make style

* fix format
2022-12-15 21:15:23 +01:00
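
Roughly the two precision modes this PR moves between, shown on a deliberately tiny, hypothetical UNet; the real script runs the full Stable Diffusion UNet inside the textual-inversion training loop.

```python
import torch
from diffusers import UNet2DConditionModel

# Hypothetical tiny UNet, only to illustrate the precision change.
unet = UNet2DConditionModel(
    sample_size=32, block_out_channels=(32, 64), cross_attention_dim=32,
    down_block_types=("DownBlock2D", "CrossAttnDownBlock2D"),
    up_block_types=("CrossAttnUpBlock2D", "UpBlock2D"),
)
latents, text_embeds = torch.randn(1, 4, 32, 32), torch.randn(1, 77, 32)

# AMP flavor: fp32 weights, forward pass under a CPU bf16 autocast.
with torch.autocast("cpu", dtype=torch.bfloat16):
    out = unet(latents, timestep=10, encoder_hidden_states=text_embeds).sample

# Full-bf16 flavor (what the PR settles on): cast the weights themselves.
out = unet.to(torch.bfloat16)(latents.bfloat16(), timestep=10,
                              encoder_hidden_states=text_embeds.bfloat16()).sample
```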
CyberMeow 61dec53356
Improve pipeline_stable_diffusion_inpaint_legacy.py (#1585)
* update inpaint_legacy to allow the use of predicted noise to construct intermediate diffused images

* Update src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
2022-12-15 20:59:31 +01:00
Pedro Cuenca badddee0ef
Add state checkpointing to other training scripts (#1687)
* Add state checkpointing to other training scripts

* Fix first_epoch

* Apply suggestions from code review

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* Update Dreambooth checkpoint help message.

* Dreambooth docs: checkpoints, inference from a checkpoint.

* make style

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
2022-12-15 19:49:40 +01:00
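
The scripts lean on accelerate's state (de)serialization; a minimal sketch of the save/resume pattern, with the step interval standing in for the `--checkpointing_steps` flag mentioned in the Dreambooth docs.

```python
from accelerate import Accelerator

accelerator = Accelerator()
checkpointing_steps = 500  # exposed by the training scripts as --checkpointing_steps

for global_step in range(1, 2001):
    # ... one optimization step over objects prepared via accelerator.prepare() ...
    if global_step % checkpointing_steps == 0:
        # Dump model, optimizer, scaler, and RNG state for later resumption.
        accelerator.save_state(f"output/checkpoint-{global_step}")

# A resumed run (--resume_from_checkpoint) restores everything with:
# accelerator.load_state("output/checkpoint-2000")
```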
Anton Lozhkov 13994b2d3f
RePaint fast tests and API conforming (#1701)
* add fast tests

* better tests and fp16

* batch fixes

* Reuse preprocessing

* quickfix
2022-12-15 18:35:31 +01:00
anton- ea90bf2ba1 skip mps 2022-12-15 18:01:39 +01:00
Chino 8cecc66a74
Fix the bug that torch version less than 1.12 throws TypeError (#1671) 2022-12-14 21:29:39 +01:00
Anton Lozhkov 35b66c8e32
[Readme] Clarify package owners (#1707)
Specify that we don't actively monitor the conda scripts
2022-12-14 20:49:36 +01:00
Patrick von Platen 013edb641a
Update main docs (#1706)
* Remove bogus file

* [Docs] Remove mention of gated access since it no longer exists

* add docs to index

* Apply suggestions from code review

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
2022-12-14 20:33:54 +01:00
Anton Lozhkov 86ac3ea1d7
Delete _ 2022-12-14 13:52:29 +01:00
Anton Lozhkov ef3fcbb688
Remove all local telemetry (#1702) 2022-12-14 12:56:35 +01:00
Prathik Rao 7c823c2ed7
manually update train_unconditional_ort (#1694)
* manually update train_unconditional_ort

* formatting

Co-authored-by: Prathik Rao <prathikrao@microsoft.com>
2022-12-14 11:35:41 +01:00
Pedro Cuenca 784beee969
Dreambooth: use warnings instead of logger in parse_args() (#1688)
Use warnings instead of logger in parse_args()

logger requires an `Accelerator`.
2022-12-13 22:01:48 +01:00
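
The reason for the swap, in a small hypothetical sketch: argument validation runs before the `Accelerator` (and hence accelerate's logger) exists, so the stdlib `warnings` module is used instead.

```python
import argparse
import warnings

def parse_args():
    parser = argparse.ArgumentParser()
    parser.add_argument("--with_prior_preservation", action="store_true")
    parser.add_argument("--class_data_dir", type=str, default=None)
    args = parser.parse_args()

    # No Accelerator exists yet, so accelerate's logger can't be used here.
    if args.class_data_dir is not None and not args.with_prior_preservation:
        warnings.warn("You need not use --class_data_dir without --with_prior_preservation.")
    return args
```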
Patrick von Platen 8b7cb962a5 make style 2022-12-13 17:01:18 +00:00
Patrick von Platen e1bb8f6188
[Community pipeline] Add github mechanism (#1680)
* [Community pipeline] Add github mechanism

* better

* Apply suggestions from code review

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>

* adapt

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
2022-12-13 18:01:00 +01:00
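
The mechanism lets `custom_pipeline` name a file in the `examples/community` folder of the diffusers GitHub repository; a sketch using the long-prompt-weighting community pipeline as an example name.

```python
from diffusers import DiffusionPipeline

# "lpw_stable_diffusion" is resolved from examples/community on GitHub,
# so community pipeline code no longer has to live on the Hub or locally.
pipe = DiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    custom_pipeline="lpw_stable_diffusion",
)
```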
Patrick von Platen e62dd5cfa8
Change one-step dummy pipeline for testing (#1690)
Change the one-step dummy pipeline for testing
2022-12-13 16:55:49 +01:00
w4ffl35 07f95503e5
Disable telemetry when DISABLE_TELEMETRY is set (#1686)
fixed #1685 - disables telemetry when DISABLE_TELEMETRY or HF_HUB_OFFLINE is set
2022-12-13 16:28:07 +01:00
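
The opt-out is environment-variable based; a sketch of setting it from Python before the library is used (either variable works per the linked fix).

```python
import os

# Set before importing/using diffusers; HF_HUB_OFFLINE=1 also disables the ping.
os.environ["DISABLE_TELEMETRY"] = "1"

from diffusers import StableDiffusionPipeline  # noqa: E402
```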
Pedro Cuenca e01d6cf295
Dreambooth: save / restore training state (#1668)
* Dreambooth: save / restore training state.

* make style

* Rename vars for clarity.

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* Remove unused import

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
2022-12-13 15:16:44 +01:00
Patrick von Platen 244e16a7ab
[Version] Bump to 0.11.0.dev0 (#1682)
upgrade version
2022-12-13 13:51:36 +01:00
Patrick von Platen b345c74d4d
Make sure all pipelines can run with batched input (#1669)
* [SD] Make sure batched input works correctly

* uP

* uP

* up

* up

* uP

* up

* fix mask stuff

* up

* uP

* more up

* up

* uP

* up

* finish

* Apply suggestions from code review

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
2022-12-13 12:50:15 +01:00
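
What "batched input" means at the call site, sketched for Stable Diffusion: a list of prompts is handled as a single batch, and the other pipelines follow the same convention after this PR.

```python
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5").to("cuda")

# Lists are treated as a batch; images and masks in the img2img and
# inpaint pipelines follow the same per-element batching rules.
prompts = ["a photo of a cat", "a photo of a dog", "an oil painting of a ship"]
images = pipe(prompts, num_inference_steps=25).images
```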
apolinario b417042291
Fix wrong type checking in `convert_diffusers_to_original_stable_diffusion.py` (#1681)
* Fix type checking remainders

* Remove IS_V20_MODEL flag always being True

Co-authored-by: apolinario <joaopaulo.passos+multimodal@gmail.com>
2022-12-13 12:44:20 +01:00
Suvaditya Mukherjee 40c16ed2f0
Added Community pipeline for comparing Stable Diffusion v1.1-4 checkpoints (#1584)
* Added Community pipeline for comparing Stable Diffusion v1.1-4

Signed-off-by: Suvaditya Mukherjee <suvadityamuk@gmail.com>

* Made changes to provide support for current iteration of from_pretrained and added example

Signed-off-by: Suvaditya Mukherjee <suvadityamuk@gmail.com>

* updated a small spelling error

Signed-off-by: Suvaditya Mukherjee <suvadityamuk@gmail.com>

* added pipeline entry to table

Signed-off-by: Suvaditya Mukherjee <suvadityamuk@gmail.com>

Signed-off-by: Suvaditya Mukherjee <suvadityamuk@gmail.com>
2022-12-13 11:31:30 +01:00
Patrick von Platen 69de9b2eaa
[Textual Inversion] Do not update other embeddings (#1665) 2022-12-12 17:44:39 +01:00
Patrick von Platen 3ce6380d3a
[SD] Make sure scheduler is correct when converting (#1667) 2022-12-12 16:57:48 +01:00
Cyberes d2dc4de303
Handle missing global_step key in scripts/convert_original_stable_diffusion_to_diffusers.py (#1612)
handle missing global_step key and don't download config if it already exists
2022-12-12 16:10:52 +01:00
Kangfu Mei ded3299d68
fix bug if we don't do_classifier_free_guidance (#1601)
* fix bug if we don't do_classifier_free_guidance

* update for copied diffusers.pipelines.alt_diffusion.pipeline_alt_diffusion.AltDiffusionPipeline
2022-12-12 15:02:13 +01:00
Patrick von Platen 8bf5e59931
Deprecate init image correctly (#1649)
Deprecate init image correctly
2022-12-12 15:00:20 +01:00
Prathik Rao 4645e28355
tensor format ort bug fix (#1557)
bug fix

Co-authored-by: Prathik Rao <prathikrao@microsoft.com>
Co-authored-by: anton- <anton@huggingface.co>
2022-12-12 13:56:02 +01:00