Commit Graph

71 Commits

Author SHA1 Message Date
Patrick von Platen c18941b01a
[Better scheduler docs] Improve usage examples of schedulers (#890)
* [Better scheduler docs] Improve usage examples of schedulers

* finish

* fix warnings and add test

* finish

* more replacements

* adapt fast tests hf token

* correct more

* Apply suggestions from code review

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>

* Integrate compatibility with euler

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
2022-10-31 17:26:30 +01:00
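Editor's note: a minimal sketch of the scheduler-swapping pattern these improved docs promote. The class names and the `from_config` call are assumptions based on the diffusers API of this period, not text from the commit itself.

```python
from diffusers import StableDiffusionPipeline, EulerDiscreteScheduler

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")

# Build a compatible scheduler from the current scheduler's config and swap it in.
pipe.scheduler = EulerDiscreteScheduler.from_config(pipe.scheduler.config)

image = pipe("an astronaut riding a horse").images[0]
```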
Nathan Lambert 12fd0736dc
clean incomplete pages (#1008) 2022-10-29 09:28:26 +02:00
Minwoo Byeon fc0ca47456
Fix speedup ratio in fp16.mdx (#837) 2022-10-29 09:26:23 +02:00
Pedro Cuenca 6b185b6acd
Update training and fine-tuning docs (#1020)
* Update training and fine-tuning docs.

* Update examples README.

* Update README.

* Add Flax fine-tuning section.

* Accept suggestion

Co-authored-by: Anton Lozhkov <anton@huggingface.co>

* Accept suggestion

Co-authored-by: Anton Lozhkov <anton@huggingface.co>

Co-authored-by: Anton Lozhkov <anton@huggingface.co>
2022-10-28 21:02:08 +02:00
Pi Esposito de00c63217
Document sequential CPU offload method on Stable Diffusion pipeline (#1024)
* document cpu offloading method

* address review comments

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
2022-10-27 16:52:21 +02:00
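Editor's note: a rough sketch of the documented usage, assuming the `enable_sequential_cpu_offload` method this PR documents on the Stable Diffusion pipeline.

```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)

# Sub-models are moved to the GPU only while they are needed, then back to CPU,
# trading some speed for a much lower peak VRAM footprint.
pipe.enable_sequential_cpu_offload()

image = pipe("a photo of an astronaut riding a horse").images[0]
```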
Ella Charlaix e2243de5f2
Fix typo in documentation title (#975) 2022-10-25 20:20:16 +02:00
Patrick von Platen 88fa6b7d68
[Dance Diffusion] Add dance diffusion (#803)
* start

* add more logic

* Update src/diffusers/models/unet_2d_condition_flax.py

* match weights

* up

* make model work

* making class more general, fixing missed file rename

* small fix

* make new conversion work

* up

* finalize conversion

* up

* first batch of variable renamings

* remove c and c_prev var names

* add mid and out block structure

* add pipeline

* up

* finish conversion

* finish

* upload

* more fixes

* Apply suggestions from code review

* add attr

* up

* uP

* up

* finish tests

* finish

* uP

* finish

* fix test

* up

* naming consistency in tests

* Apply suggestions from code review

Co-authored-by: Suraj Patil <surajp815@gmail.com>
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
Co-authored-by: Nathan Lambert <nathan@huggingface.co>
Co-authored-by: Anton Lozhkov <anton@huggingface.co>

* remove hardcoded 16

* Remove bogus

* fix some stuff

* finish

* improve logging

* docs

* upload

Co-authored-by: Nathan Lambert <nol@berkeley.edu>
Co-authored-by: Suraj Patil <surajp815@gmail.com>
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
Co-authored-by: Nathan Lambert <nathan@huggingface.co>
Co-authored-by: Anton Lozhkov <anton@huggingface.co>
2022-10-25 18:39:25 +02:00
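Editor's note: a hypothetical usage sketch for the new audio pipeline; the checkpoint name and output handling are assumptions, not taken from this commit.

```python
from diffusers import DanceDiffusionPipeline

pipe = DanceDiffusionPipeline.from_pretrained("harmonai/maestro-150k")

output = pipe(audio_length_in_s=4.0, num_inference_steps=100)
audio = output.audios[0]  # raw waveform, one entry per generated sample
```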
Pedro Cuenca 3d02c92187
mps changes for PyTorch 1.13 (#926)
* Docs: refer to pre-RC version of PyTorch 1.13.0.

* Remove temporary workaround for unavailable op.

* Update comment to make it less ambiguous.

* Remove use of contiguous in mps.

It appears to no longer be necessary.

* Special case: use einsum for much better performance in mps

* Update mps docs.

* Minor doc update.

* Accept suggestion

Co-authored-by: Anton Lozhkov <anton@huggingface.co>

Co-authored-by: Anton Lozhkov <anton@huggingface.co>
2022-10-25 16:41:51 +02:00
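Editor's note: a sketch of running a pipeline on Apple Silicon via the `mps` backend, as the updated docs describe; the one-step warm-up pass is the workaround those docs recommend, and the exact wording here is an assumption.

```python
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
pipe = pipe.to("mps")

# One-step warm-up pass: the first mps forward pass can otherwise produce degraded results.
_ = pipe("warm-up prompt", num_inference_steps=1)

image = pipe("a photo of an astronaut riding a horse").images[0]
```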
Nathan Lambert 2fb8fafa4b
add community pipeline docs; add minimal text to some empty doc pages (#930)
* add community pipeline docs

* fix style in code snippets (lol)

* clean up loading docs

* add license to doc files

* fix some weird links
2022-10-24 14:20:08 -07:00
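Editor's note: a sketch of loading a community pipeline by name via the `custom_pipeline` argument these docs cover; the `one_step_unet` pipeline id is an illustrative assumption.

```python
from diffusers import DiffusionPipeline

# "one_step_unet" refers to a toy pipeline file in the community folder;
# any community pipeline can be named the same way.
pipe = DiffusionPipeline.from_pretrained(
    "google/ddpm-cifar10-32", custom_pipeline="one_step_unet"
)
```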
apolinario 8aac1f99d7
v1-5 docs updates (#921)
* Update README.md

Additionally add FLAX so the model card can be slimmer and point to this page

* Find and replace all

* v-1-5 -> v1-5

* revert test changes

* Update README.md

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* Update docs/source/quicktour.mdx

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>

* Update README.md

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>

* Update docs/source/quicktour.mdx

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>

* Update README.md

Co-authored-by: Suraj Patil <surajp815@gmail.com>

* Revert certain references to v1-5

* Docs changes

* Apply suggestions from code review

Co-authored-by: apolinario <joaopaulo.passos+multimodal@gmail.com>
Co-authored-by: anton-l <anton@huggingface.co>
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
Co-authored-by: Suraj Patil <surajp815@gmail.com>
2022-10-24 22:50:23 +02:00

Patrick von Platen 83f8a5ff70
[Stable Diffusion] Add components function (#889)
* [Stable Diffusion] Add components function

* uP
2022-10-20 13:28:11 +02:00
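Editor's note: a minimal sketch of the new `components` property, which lets a second pipeline reuse the already-loaded weights instead of downloading and allocating them again (pipeline names are assumptions).

```python
from diffusers import StableDiffusionPipeline, StableDiffusionImg2ImgPipeline

text2img = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")

# Build an img2img pipeline from the same components, sharing the same weights in memory.
img2img = StableDiffusionImg2ImgPipeline(**text2img.components)
```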
Pedro Cuenca 8124863d1f
Initial docs update for new in-painting pipeline (#910)
Docs update for new in-painting pipeline.

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
2022-10-19 17:31:23 +02:00
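Editor's note: a sketch of the new in-painting pipeline usage the docs describe; the checkpoint name and file paths are illustrative assumptions.

```python
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

pipe = StableDiffusionInpaintPipeline.from_pretrained("runwayml/stable-diffusion-inpainting")

init_image = Image.open("input.png").convert("RGB").resize((512, 512))
mask_image = Image.open("mask.png").convert("RGB").resize((512, 512))  # white = repaint

image = pipe(
    prompt="a yellow cat sitting on a bench",
    image=init_image,
    mask_image=mask_image,
).images[0]
```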
Patrick von Platen c1b6ea3dce
Update img2img.mdx 2022-10-12 00:52:30 +02:00
Pedro Cuenca 24b8b5cf5e
`mps`: Alternative implementation for `repeat_interleave` (#766)
* mps: alt. implementation for repeat_interleave

* style

* Bump mps version of PyTorch in the documentation.

* Apply suggestions from code review

Co-authored-by: Suraj Patil <surajp815@gmail.com>

* Simplify: do not check for device.

* style

* Fix repeat dimensions:

- The unconditional embeddings are always created from a single prompt.
- I was shadowing the batch_size var.

* Split long lines as suggested by Suraj.

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
Co-authored-by: Suraj Patil <surajp815@gmail.com>
2022-10-11 20:30:09 +02:00
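Editor's note: an illustrative stand-in for the mps-friendly pattern this PR adopts — emulating `repeat_interleave` along the batch dimension with `repeat` plus a reshape. The exact tensor names are assumptions; the equivalence itself holds.

```python
import torch

x = torch.randn(2, 77, 768)  # (batch, seq, dim), e.g. text embeddings
n = 3                        # copies per prompt

bs, seq, dim = x.shape
out = x.repeat(1, n, 1).view(bs * n, seq, dim)

# Same result as the op that was problematic on mps:
assert torch.equal(out, x.repeat_interleave(n, dim=0))
```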
Omar Sanseviero 757babfcad
Fix indentation in the code example (#802)
Update custom_pipelines.mdx
2022-10-11 20:26:52 +02:00
anton-l 970e30606c Revert "[v0.4.0] Temporarily remove Flax modules from the public API (#755)"
This reverts commit 2e209c30cf.
2022-10-06 18:35:40 +02:00
Anton Lozhkov 2e209c30cf
[v0.4.0] Temporarily remove Flax modules from the public API (#755)
Temporarily remove Flax modules from the public API
2022-10-06 18:10:36 +02:00
Patrick von Platen d9c449ea30
Custom Pipelines (#744)
* [Custom Pipelines]

* uP

* make style

* finish

* finish

* remove ipdb

* upload

* fix

* finish docs

* Apply suggestions from code review

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
Co-authored-by: apolinario <joaopaulo.passos@gmail.com>

* finish

* final uploads

* remove unnecessary test

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
Co-authored-by: apolinario <joaopaulo.passos@gmail.com>
2022-10-06 16:54:02 +02:00
Patrick von Platen 4deb16e830
[Docs] Advertise fp16 instead of autocast (#740)
up
2022-10-05 22:20:53 +02:00
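Editor's note: a sketch of the pattern the docs now advertise — loading the weights in fp16 directly instead of wrapping inference in `torch.autocast`.

```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,  # half-precision weights, no autocast context needed
).to("cuda")

image = pipe("a photo of an astronaut riding a horse").images[0]
```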
Suraj Patil 19e559d5e9
remove use_auth_token from remaining places (#737)
remove use_auth_token
2022-10-05 17:40:49 +02:00
Patrick von Platen 78744b6a8f
No more use_auth_token=True (#733)
* up

* uP

* uP

* make style

* Apply suggestions from code review

* up

* finish
2022-10-05 17:16:15 +02:00
Kashif Rasul 726aba089d
[Pytorch] pytorch only timesteps (#724)
* pytorch timesteps

* style

* get rid of if-else

* fix test

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
2022-10-05 12:55:51 +02:00
Yuta Hayashibe 7e92c5bc73
Fix typos (#718)
* Fix typos

* Update examples/dreambooth/train_dreambooth.py

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
2022-10-04 15:22:14 +02:00
Nouamane Tazi daa22050c7
[docs] fix table in fp16.mdx (#683) 2022-09-30 15:15:22 +02:00
Nouamane Tazi 9ebaea545f
Optimize Stable Diffusion (#371)
* initial commit

* make UNet stream capturable

* try to fix noise_pred value

* remove cuda graph and keep NB

* non blocking unet with PNDMScheduler

* make timesteps np arrays for pndm scheduler
because lists don't get formatted to tensors in `self.set_format`

* make max async in pndm

* use channel last format in unet

* avoid moving timesteps device in each unet call

* avoid memcpy op in `get_timestep_embedding`

* add `channels_last` kwarg to `DiffusionPipeline.from_pretrained`

* update TODO

* replace `channels_last` kwarg with `memory_format` for more generality

* revert the channels_last changes to leave it for another PR

* remove non_blocking when moving input ids to device

* remove blocking from all .to() operations at beginning of pipeline

* fix merging

* fix merging

* model can run in other precisions without autocast

* attn refactoring

* Revert "attn refactoring"

This reverts commit 0c70c0e189cd2c4d8768274c9fcf5b940ee310fb.

* remove restriction to run conv_norm in fp32

* use `baddbmm` instead of `matmul` in attention for better perf

* removing all reshapes to test perf

* Revert "removing all reshapes to test perf"

This reverts commit 006ccb8a8c6bc7eb7e512392e692a29d9b1553cd.

* add shapes comments

* hardcode what's needed for jitting

* Revert "hardcore whats needed for jitting"

This reverts commit 2fa9c698eae2890ac5f8e367ca80532ecf94df9a.

* Revert "remove restriction to run conv_norm in fp32"

This reverts commit cec592890c32da3d1b78d38b49e4307aedf459b9.

* revert using baddmm in attention's forward

* cleanup comment

* remove restriction to run conv_norm in fp32. no quality loss was noticed

This reverts commit cc9bc1339c998ebe9e7d733f910c6d72d9792213.

* add more optimizations techniques to docs

* Revert "add shapes comments"

This reverts commit 31c58eadb8892f95478cdf05229adf678678c5f4.

* apply suggestions

* make quality

* apply suggestions

* styling

* `scheduler.timesteps` are now arrays so we don't need .to()

* remove useless .type()

* use mean instead of max in `test_stable_diffusion_inpaint_pipeline_k_lms`

* move scheduler timesteps to correct device if tensors

* add device to `set_timesteps` in LMSD scheduler

* `self.scheduler.set_timesteps` now uses device arg for schedulers that accept it

* quick fix

* styling

* remove kwargs from schedulers `set_timesteps`

* revert to using max in K-LMS inpaint pipeline test

* Revert "`self.scheduler.set_timesteps` now uses device arg for schedulers that accept it"

This reverts commit 00d5a51e5c20d8d445c8664407ef29608106d899.

* move timesteps to correct device before loop in SD pipeline

* apply previous fix to other SD pipelines

* UNet now accepts tensor timesteps even on wrong device, to avoid errors
- it shouldn't affect performance if timesteps are already on the correct device
- it does slow down performance if they're on the wrong device

* fix pipeline when timesteps are arrays with strides
2022-09-30 09:49:13 +02:00
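Editor's note: an illustrative sketch of one of the optimizations discussed above — running the conv-heavy UNet in channels-last memory format. This is an assumption about the recommended usage, not the exact diff.

```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# NHWC layout tends to be faster for convolution-heavy models on recent GPUs.
pipe.unet.to(memory_format=torch.channels_last)

image = pipe("a photo of an astronaut riding a horse").images[0]
```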
Tanishq Abraham f5b9bc8b49
Update index.mdx (#670) 2022-09-29 09:17:52 +02:00
Kashif Rasul bd8df2da89
[Pytorch] Pytorch only schedulers (#534)
* pytorch only schedulers

* fix style

* remove match_shape

* pytorch only ddpm

* remove SchedulerMixin

* remove numpy from karras_ve

* fix types

* remove numpy from lms_discrete

* remove numpy from pndm

* fix typo

* remove mixin and numpy from sde_vp and ve

* remove remaining tensor_format

* fix style

* sigmas has to be torch tensor

* removed set_format in readme

* remove set format from docs

* remove set_format from pipelines

* update tests

* fix typo

* continue to use mixin

* fix imports

* removed unused imports

* match shape instead of assuming image shapes

* remove import typo

* update call to add_noise

* use math instead of numpy

* fix t_index

* removed commented out numpy tests

* timesteps needs to be discrete

* cast timesteps to int in flax scheduler too

* fix device mismatch issue

* small fix

* Update src/diffusers/schedulers/scheduling_pndm.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
2022-09-27 15:27:34 +02:00
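Editor's note: a sketch of what a denoising loop looks like after the PyTorch-only change — timesteps come back as torch tensors and there is no `set_format` / NumPy round-trip. The scheduler choice and the random stand-in for the model output are assumptions for illustration.

```python
import torch
from diffusers import DDPMScheduler

scheduler = DDPMScheduler(num_train_timesteps=1000)
scheduler.set_timesteps(50)

sample = torch.randn(1, 3, 32, 32)
for t in scheduler.timesteps:                    # torch tensor of timesteps
    model_output = torch.randn_like(sample)      # stand-in for a UNet prediction
    sample = scheduler.step(model_output, t, sample).prev_sample
```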
Younes Belkada 8b0be93596
Flax documentation (#589)
* documenting `attention_flax.py` file

* documenting `embeddings_flax.py`

* documenting `unet_blocks_flax.py`

* Add new objs to doc page

* document `vae_flax.py`

* Apply suggestions from code review

* modify `unet_2d_condition_flax.py`

* make style

* Apply suggestions from code review

* make style

* Apply suggestions from code review

* fix indent

* fix typo

* fix indent unet

* Update src/diffusers/models/vae_flax.py

* Apply suggestions from code review

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>

Co-authored-by: Mishig Davaadorj <dmishig@gmail.com>
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
2022-09-23 13:24:16 +02:00
Ryan Russell df80ccf7de
docs: `.md` readability fixups (#619)
Signed-off-by: Ryan Russell <git@ryanrussell.org>
2022-09-23 12:02:27 +02:00
Ryan Russell f149d037de
docs: fix `stochastic_karras_ve` ref (#618)
Signed-off-by: Ryan Russell <git@ryanrussell.org>

Signed-off-by: Ryan Russell <git@ryanrussell.org>
2022-09-22 18:36:29 +02:00
Yuta Hayashibe 76d492ea49
Fix typos and add Typo check GitHub Action (#483)
* Fix typos

* Add a typo check action

* Fix a bug

* Changed to manual typo check currently

Ref: https://github.com/huggingface/diffusers/pull/483#pullrequestreview-1104468010

Co-authored-by: Anton Lozhkov <aglozhkov@gmail.com>

* Removed a confusing message

* Renamed "nin_shortcut" to "in_shortcut"

* Add memo about NIN

Co-authored-by: Anton Lozhkov <aglozhkov@gmail.com>
2022-09-16 15:36:51 +02:00
Jithin James ab7a78e8f1
docs: fix broken doc links for relative links (#504)
fix: broken doc links for relative links
2022-09-14 00:50:02 +02:00
Nathan Lambert 25a51b63ca
fix table formatting for stable diffusion pipeline doc (add blank line) (#471)
fix table formatting (add blank line)
2022-09-12 10:28:27 +02:00
Partho 8eaaa546d8
Docs: fix installation typo (#453)
installation doc typo fix
2022-09-09 15:17:17 -06:00
Patrick von Platen 44968e4204
[Docs] Correct links (#432) 2022-09-08 21:29:24 +02:00
Patrick von Platen 1e98723e12 finish 2022-09-08 17:47:54 +02:00
Patrick von Platen 4e2c1f3a4d
Add config docs (#429)
* advance

* finish

* finish
2022-09-08 17:46:03 +02:00
Kashif Rasul 5e6417e988
[Docs] Models (#416)
* docs for attention

* types for embeddings

* unet2d docstrings

* UNet2DConditionModel docstrings

* fix typos

* style and vq-vae docstrings

* docstrings for VAE

* Update src/diffusers/models/unet_2d.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* make style

* added inherits from sentence

* docstring to forward

* make style

* Apply suggestions from code review

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>

* finish model docs

* up

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
2022-09-08 17:28:11 +02:00
Patrick von Platen 234e90cca7
[Docs] Using diffusers (#428)
* [Docs] Using diffusers

* up
2022-09-08 17:27:36 +02:00
Patrick von Platen f6fb3282b1
[Outputs] Improve syntax (#423)
* [Outputs] Improve syntax

* improve more

* fix docstring return

* correct all

* uP

Co-authored-by: Mishig Davaadorj <dmishig@gmail.com>
2022-09-08 16:46:38 +02:00
Pedro Cuenca 1a79969d23
Initial ONNX doc (TODO: Installation) (#426) 2022-09-08 16:46:24 +02:00
Patrick von Platen 43c585111d
[Docs] Outputs.mdx (#422)
* up

* remove bogus file
2022-09-08 14:47:14 +02:00
Patrick von Platen 46013e8e3f
[Docs] Fix scheduler docs (#421)
* [Docs] Fix scheduler docs

* up

* Apply suggestions from code review
2022-09-08 14:04:09 +02:00
Patrick von Platen e7457b377d
[Docs] DiffusionPipeline (#418)
* Start

* up

* up

* finish
2022-09-08 13:50:06 +02:00
Patrick von Platen 98f346835a
[Docs] Minor fixes in optimization section (#420)
* uP

* more
2022-09-08 13:13:46 +02:00
Satpal Singh Rathore 6b9906f6c2
[Docs] Pipelines for inference (#417)
* Update conditional_image_generation.mdx

* Update unconditional_image_generation.mdx
2022-09-08 12:42:13 +02:00
Patrick von Platen a353c46ec0
[Docs] Training docs (#415)
finish training docs
2022-09-08 10:17:37 +02:00
Pedro Cuenca c29d81c3e3
Docs: fp16 page (#404)
* Initial version of `fp16` page.

* Fix typo in README.

* Change titles of fp16 section in toctree.

* PR suggestion

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* PR suggestion

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* Clarify attention slicing is useful even for batches of 1

Explained by @patrickvonplaten after a suggestion by @keturn.

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* Do not talk about `batches` in `enable_attention_slicing`.

* Use Tip (just for fun), add link to method.

* Comment about fp16 results looking the same as float32 in practice.

* Style: docstring line wrapping.

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
2022-09-08 09:17:51 +02:00
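Editor's note: a sketch of the attention-slicing knob the fp16 page discusses — a small slowdown in exchange for a large memory saving, useful even at batch size 1.

```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

pipe.enable_attention_slicing()  # compute attention in slices instead of all at once

image = pipe("a photo of an astronaut riding a horse").images[0]
```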
Nathan Lambert b8894f181d
Docs fix some typos (#408)
* fix small typos

* capitalize Diffusers
2022-09-08 09:08:35 +02:00
Nathan Lambert e6110f6856
[docs sprint] schedulers docs, will update (#376)
* init schedulers docs

* add some docstrings, fix sidebar formatting

* add docstrings

* [Type hint] PNDM schedulers (#335)

* [Type hint] PNDM Schedulers

* ran make style

* updated timesteps type hint

* apply suggestions from code review

* ran make style

* removed unused import

* [Type hint] scheduling ddim (#343)

* [Type hint] scheduling ddim

* apply suggestions from code review

apply suggestions to also return the return type

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* make style

* update class docstrings

* add docstrings

* missed merge edit

* add general docs page

* modify headings for right sidebar

Co-authored-by: Partho <parthodas6176@gmail.com>
Co-authored-by: Santiago Víquez <santi.viquez@gmail.com>
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
2022-09-08 09:07:44 +02:00