Commit Graph

172 Commits

Author SHA1 Message Date
brkirch ada17dbd7c Refactor conditional casting, fix upscalers 2023-01-28 04:16:25 -05:00
brkirch c4b9b07db6 Fix embeddings dtype mismatch 2023-01-26 09:00:15 -05:00
AUTOMATIC 6073456c83 write a comment for fix_checkpoint function 2023-01-19 20:39:10 +03:00
AUTOMATIC 924e222004 add option to show/hide warnings
removed hiding warnings from LDSR
fixed/reworked a few places that produced warnings
2023-01-18 23:04:24 +03:00
AUTOMATIC 085427de0e make it possible for extensions/scripts to add their own embedding directories 2023-01-08 09:37:33 +03:00
AUTOMATIC1111 c295e4a244 Merge pull request #6055 from brkirch/sub-quad_attn_opt
Add Birch-san's sub-quadratic attention implementation
2023-01-07 12:26:55 +03:00
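The sub-quadratic attention merged in the entry above avoids materializing the full attention matrix: queries and keys are processed in chunks and the partial softmax results are merged with a running log-sum-exp, so peak memory scales with the chunk sizes rather than with the product of the sequence lengths. A minimal sketch of that idea, with illustrative chunk sizes and names (not the actual modules/sub_quadratic_attention.py code):

```python
# Sketch of chunked ("sub-quadratic") attention: never build the full
# (n_q x n_k) score matrix; walk over query and key chunks and merge the
# partial softmax results with a running log-sum-exp.
import torch

def chunked_attention(q, k, v, q_chunk=1024, k_chunk=4096):
    # q, k, v: (batch*heads, tokens, dim_head); chunk sizes are illustrative
    scale = q.shape[-1] ** -0.5
    out = torch.empty_like(q)
    for i in range(0, q.shape[-2], q_chunk):
        qc = q[..., i:i + q_chunk, :] * scale
        acc = torch.zeros_like(qc)                        # running weighted sum of v
        lse = torch.full(qc.shape[:-1] + (1,), float("-inf"),
                         dtype=q.dtype, device=q.device)  # running log-sum-exp
        for j in range(0, k.shape[-2], k_chunk):
            kc = k[..., j:j + k_chunk, :]
            vc = v[..., j:j + k_chunk, :]
            s = qc @ kc.transpose(-2, -1)                 # (..., q_chunk, k_chunk)
            m = s.amax(dim=-1, keepdim=True)
            chunk_lse = m + (s - m).exp().sum(dim=-1, keepdim=True).log()
            new_lse = torch.logaddexp(lse, chunk_lse)
            # rescale what we have so far, then add the new chunk's contribution
            acc = acc * (lse - new_lse).exp() + (s - new_lse).exp() @ vc
            lse = new_lse
        out[..., i:i + q_chunk, :] = acc
    return out
```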
AUTOMATIC 79e39fae61 CLIP hijack rework 2023-01-07 01:46:13 +03:00
brkirch 5deb2a19cc Allow Doggettx's cross attention opt without CUDA 2023-01-06 01:33:15 -05:00
brkirch 3bfe2bb549 Merge remote-tracking branch 'upstream/master' into sub-quad_attn_opt 2023-01-06 00:15:22 -05:00
brkirch f6ab5a39d7 Merge branch 'AUTOMATIC1111:master' into sub-quad_attn_opt 2023-01-06 00:14:20 -05:00
brkirch d782a95967 Add Birch-san's sub-quadratic attention implementation 2023-01-06 00:14:13 -05:00
Vladimir Mandic 21ee77db31 add cross-attention info 2023-01-04 08:04:38 -05:00
AUTOMATIC f34c734172 alt-diffusion integration 2022-12-31 18:06:35 +03:00
AUTOMATIC 3f401cdb64 Merge remote-tracking branch 'baai-open-internal/master' into alt-diffusion 2022-12-31 13:02:28 +03:00
AUTOMATIC 505ec7e4d9 cleanup some unneeded imports for hijack files 2022-12-10 09:17:39 +03:00
AUTOMATIC 7dbfd8a7d8 do not replace entire unet for the resolution hack 2022-12-10 09:14:45 +03:00
AUTOMATIC1111 2641d1b83b Merge pull request #4978 from aliencaocao/support_any_resolution
Patch UNet Forward to support resolutions that are not multiples of 64
2022-12-10 08:45:41 +03:00
zhaohu xing 5dcc22606d add hash and fix undo hijack bug
Signed-off-by: zhaohu xing <920232796@qq.com>
2022-12-06 16:04:50 +08:00
Zac Liu a25dfebeed Merge pull request #3 from 920232796/master
fix device support for mps
update the support for SD2.0
2022-12-06 09:17:57 +08:00
Zac Liu 3ebf977a6e Merge branch 'AUTOMATIC1111:master' into master 2022-12-06 09:16:15 +08:00
zhaohu xing 4929503258 fix bugs
Signed-off-by: zhaohu xing <920232796@qq.com>
2022-12-06 09:03:55 +08:00
AUTOMATIC 0d21624cee move #5216 to the extension 2022-12-03 18:16:19 +03:00
AUTOMATIC 89e1df013b Merge remote-tracking branch 'wywywywy/autoencoder-hijack' 2022-12-03 18:08:10 +03:00
AUTOMATIC1111 a2feaa95fc Merge pull request #5194 from brkirch/autocast-and-mps-randn-fixes
Use devices.autocast() and fix MPS randn issues
2022-12-03 09:58:08 +03:00
SmirkingFace da698ca92e Fixed AttributeError where openaimodel is not found 2022-12-02 13:47:02 +01:00
zhaohu xing 52cc83d36b fix bugs
Signed-off-by: zhaohu xing <920232796@qq.com>
2022-11-30 14:56:12 +08:00
zhaohu xing 0831ab476c Merge branch 'master' into master 2022-11-30 10:13:17 +08:00
wywywywy 36c3613d16 Add autoencoder to sd_hijack 2022-11-29 17:40:02 +00:00
zhaohu xing 75c4511e6b add AltDiffusion to webui
Signed-off-by: zhaohu xing <920232796@qq.com>
2022-11-29 10:28:41 +08:00
brkirch 98ca437edf Refactor and instead check if mps is being used, not availability 2022-11-28 21:18:51 -05:00
AUTOMATIC b48b7999c8 Merge remote-tracking branch 'flamelaw/master' 2022-11-27 12:19:59 +03:00
Billy Cao 349f0461ec Merge branch 'master' into support_any_resolution 2022-11-27 12:39:31 +08:00
AUTOMATIC 64c7b7975c restore hypernetworks to seemingly working state 2022-11-26 16:45:57 +03:00
AUTOMATIC ce6911158b Add support for Stable Diffusion 2.0 2022-11-26 16:10:46 +03:00
Billy Cao adb6cb7619 Patch UNet Forward to support resolutions that are not multiples of 64
Also modified the UI to no longer step in increments of 64
2022-11-23 18:11:24 +08:00
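The patch above lets the UNet accept latents whose spatial size is not divisible by its total downsampling factor. A generic sketch of the pad-then-crop idea, assuming a wrapper around the original forward; the wrapper name, padding mode, and the multiple are illustrative, not the code from PR #4978:

```python
# Pad the latent up to the next multiple of the UNet's downsampling factor,
# run the original forward, then crop the result back to the requested size.
import torch.nn.functional as F

def unet_forward_with_padding(unet_forward, x, *args, multiple=8, **kwargs):
    h, w = x.shape[-2:]
    pad_h = (multiple - h % multiple) % multiple
    pad_w = (multiple - w % multiple) % multiple
    if pad_h or pad_w:
        # pad bottom/right so both spatial dims become multiples of `multiple`
        x = F.pad(x, (0, pad_w, 0, pad_h), mode="reflect")
    out = unet_forward(x, *args, **kwargs)
    return out[..., :h, :w]  # crop back to the original latent size
```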
flamelaw bd68e35de3 Gradient accumulation, autocast fix, new latent sampling method, etc 2022-11-20 12:35:26 +09:00
killfrenzy96 17e4432820 cleanly undo circular hijack #4818 2022-11-18 21:22:55 +11:00
AUTOMATIC c62d17aee3 use the new devices.has_mps() function in register_buffer for DDIM/PLMS fix for OSX 2022-11-12 10:00:22 +03:00
AUTOMATIC 7ba3923d5b move DDIM/PLMS fix for OSX out of the file with inpainting code. 2022-11-11 18:20:18 +03:00
Jairo Correa af758e97fa Unload sd_model before loading the other 2022-11-01 04:01:49 -03:00
AUTOMATIC 2b91251637 removed aesthetic gradients as built-in
added support for extensions
2022-10-22 12:23:58 +03:00
AUTOMATIC 9286fe53de make aesthetic embedding compatible with prompts longer than 75 tokens 2022-10-21 16:38:06 +03:00
AUTOMATIC 7d6b388d71 Merge branch 'ae' 2022-10-21 13:35:01 +03:00
C43H66N12O12S2 73b5dbf72a Update sd_hijack.py 2022-10-18 11:53:04 +03:00
C43H66N12O12S2 786ed49922 use legacy attnblock 2022-10-18 11:53:04 +03:00
MalumaDev 9324cdaa31 ui fix, reorganization of the code 2022-10-16 17:53:56 +02:00
MalumaDev e4f8b5f00d ui fix 2022-10-16 10:28:21 +02:00
MalumaDev 523140d780 ui fix 2022-10-16 10:23:30 +02:00
MalumaDev b694bba39a Merge remote-tracking branch 'origin/test_resolve_conflicts' into test_resolve_conflicts 2022-10-16 00:24:05 +02:00
MalumaDev 9325c85f78 fixed dropbox update 2022-10-16 00:23:47 +02:00
MalumaDev 97ceaa23d0 Merge branch 'master' into test_resolve_conflicts 2022-10-16 00:06:36 +02:00
C43H66N12O12S2 529afbf4d7 Update sd_hijack.py 2022-10-15 20:25:27 +03:00
MalumaDev 37d7ffb415 fix to token length, added embeddings generator, add new features to edit the embedding before generation using text 2022-10-15 15:59:37 +02:00
MalumaDev bb57f30c2d init 2022-10-14 10:56:41 +02:00
AUTOMATIC 429442f4a6 fix iterator bug for #2295 2022-10-12 13:38:03 +03:00
hentailord85ez 80f3cf2bb2 Account for mismatched line counts 2022-10-12 11:38:41 +03:00
brkirch 98fd5cde72 Add check for psutil 2022-10-11 17:24:00 +03:00
brkirch c0484f1b98 Add cross-attention optimization from InvokeAI
* Add cross-attention optimization from InvokeAI (~30% speed improvement on MPS)
* Add command line option for it
* Make it default when CUDA is unavailable
2022-10-11 17:24:00 +03:00
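The InvokeAI-style optimization above saves memory by running softmax attention on one slice of the batch*heads dimension at a time, so only a fraction of the score tensor exists at any moment. A hedged sketch with an assumed fixed slice size (the real code picks its slicing strategy from available memory, which is presumably why the psutil check above was added alongside it):

```python
# Sliced attention: full softmax per slice, but only a slice of the score
# tensor is ever resident. Slice size and shapes are illustrative.
import torch

def sliced_attention(q, k, v, slice_size=1):
    # q, k, v: (batch*heads, tokens, dim_head)
    scale = q.shape[-1] ** -0.5
    out = torch.zeros_like(q)
    for i in range(0, q.shape[0], slice_size):
        s = q[i:i + slice_size] @ k[i:i + slice_size].transpose(-2, -1) * scale
        out[i:i + slice_size] = s.softmax(dim=-1) @ v[i:i + slice_size]
    return out
```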
AUTOMATIC 873efeed49 rename hypernetwork dir to hypernetworks to prevent clash with an old filename that people who use zip instead of git clone will have 2022-10-11 15:51:30 +03:00
AUTOMATIC 5de806184f Merge branch 'master' into hypernetwork-training 2022-10-11 11:14:36 +03:00
hentailord85ez 5e2627a1a6 Comma backtrack padding (#2192) 2022-10-11 09:55:28 +03:00
C43H66N12O12S2 623251ce2b allow pascal onwards 2022-10-10 19:54:07 +03:00
hentailord85ez d5c14365fd Add back in output hidden states parameter 2022-10-10 18:54:48 +03:00
hentailord85ez 460bbae587 Pad beginning of textual inversion embedding 2022-10-10 18:54:48 +03:00
hentailord85ez b340439586 Unlimited Token Works
Unlimited tokens actually work now. Works with textual inversion too. Replaces the previous not-so-much-working implementation.
2022-10-10 18:54:48 +03:00
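The change above lifts the 75-token prompt limit by encoding the prompt in 75-token chunks, each wrapped with BOS/EOS, and concatenating the per-chunk embeddings along the token axis. A simplified sketch assuming a stock Hugging Face CLIP text encoder; the model name and chunk handling are illustrative, and the webui implementation handles more cases (textual inversion embeddings, comma backtracking):

```python
# Encode an arbitrarily long prompt as a sequence of 77-token CLIP windows.
import torch
from transformers import CLIPTokenizer, CLIPTextModel

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
text_encoder = CLIPTextModel.from_pretrained("openai/clip-vit-large-patch14")

def encode_long_prompt(prompt: str, chunk_len: int = 75) -> torch.Tensor:
    ids = tokenizer(prompt, truncation=False, add_special_tokens=False).input_ids
    bos, eos = tokenizer.bos_token_id, tokenizer.eos_token_id
    chunks = [ids[i:i + chunk_len] for i in range(0, len(ids), chunk_len)] or [[]]
    embeddings = []
    with torch.no_grad():
        for chunk in chunks:
            # wrap with BOS, pad with EOS so every window is exactly 77 tokens
            padded = [bos] + chunk + [eos] * (chunk_len + 1 - len(chunk))
            tokens = torch.tensor([padded])
            embeddings.append(text_encoder(tokens).last_hidden_state)
    return torch.cat(embeddings, dim=1)  # (1, 77 * n_chunks, 768)
```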
Fampai 1824e9ee3a Removed unnecessary tmp variable 2022-10-09 22:31:23 +03:00
Fampai ad3ae44108 Updated code for legibility 2022-10-09 22:31:23 +03:00
Fampai e59c66c008 Optimized code for Ignoring last CLIP layers 2022-10-09 22:31:23 +03:00
Fampai 1371d7608b Added ability to ignore last n layers in FrozenCLIPEmbedder 2022-10-08 22:10:37 +03:00
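Ignoring the last n layers of FrozenCLIPEmbedder (the "CLIP skip" idea) means conditioning on an earlier hidden state of the CLIP text encoder, re-normalized with the final layer norm. A hedged sketch using the transformers library; the model name and the skip_last_n parameter are assumptions for illustration, not the webui code:

```python
# Take hidden states from an earlier CLIP text-encoder layer instead of the last one.
import torch
from transformers import CLIPTokenizer, CLIPTextModel

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
model = CLIPTextModel.from_pretrained("openai/clip-vit-large-patch14")

def encode_with_clip_skip(prompt: str, skip_last_n: int = 1) -> torch.Tensor:
    tokens = tokenizer(prompt, padding="max_length", max_length=77,
                       truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**tokens, output_hidden_states=True)
    if skip_last_n > 0:
        # hidden_states[-1] is the final layer; step back skip_last_n layers,
        # then re-apply the final layer norm
        hidden = out.hidden_states[-(skip_last_n + 1)]
        hidden = model.text_model.final_layer_norm(hidden)
    else:
        hidden = out.last_hidden_state
    return hidden
```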
AUTOMATIC 3061cdb7b6 add --force-enable-xformers option and also add messages to console regarding cross attention optimizations 2022-10-08 19:22:15 +03:00
C43H66N12O12S2 cc0258aea7 check for ampere without destroying the optimizations. again. 2022-10-08 17:54:16 +03:00
C43H66N12O12S2 017b6b8744 check for ampere 2022-10-08 17:54:16 +03:00
AUTOMATIC cfc33f99d4 why did you do this 2022-10-08 17:29:06 +03:00
AUTOMATIC 27032c47df restore old opt_split_attention/disable_opt_split_attention logic 2022-10-08 17:10:05 +03:00
AUTOMATIC dc1117233e simplify xformers options: --xformers to enable and that's it 2022-10-08 17:02:18 +03:00
AUTOMATIC1111 48feae37ff Merge pull request #1851 from C43H66N12O12S2/flash
xformers attention
2022-10-08 16:29:59 +03:00
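The xformers attention merged above swaps the plain softmax attention for xformers' memory-efficient kernel whenever the library is importable (the --xformers option mentioned above enables it). A minimal illustration of that routing, with a simplified PyTorch fallback; this is not the actual hijack code:

```python
# Route attention through xformers when available, otherwise do it in plain PyTorch.
import torch

try:
    import xformers.ops
    HAVE_XFORMERS = True
except ImportError:
    HAVE_XFORMERS = False

def attention_forward(q, k, v):
    # q, k, v: (batch, tokens, dim_head), CUDA tensors when xformers is used
    if HAVE_XFORMERS:
        return xformers.ops.memory_efficient_attention(q, k, v)
    scale = q.shape[-1] ** -0.5
    return (q @ k.transpose(-2, -1) * scale).softmax(dim=-1) @ v
```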
C43H66N12O12S2 970de9ee68 Update sd_hijack.py 2022-10-08 16:29:43 +03:00
C43H66N12O12S2 26b459a379 default to split attention if cuda is available and xformers is not 2022-10-08 16:20:04 +03:00
MrCheeze 5f85a74b00 fix bug where, when using prompt composition, hijack_comments generated before the final AND would be dropped 2022-10-08 15:48:04 +03:00
AUTOMATIC 77f4237d1c fix bugs related to variable prompt lengths 2022-10-08 15:25:59 +03:00
AUTOMATIC 4999eb2ef9 do not let user choose his own prompt token count limit 2022-10-08 14:25:47 +03:00
AUTOMATIC 706d5944a0 let user choose his own prompt token count limit 2022-10-08 13:38:57 +03:00
C43H66N12O12S2 91d66f5520 use new attnblock for xformers path 2022-10-08 11:56:01 +03:00
C43H66N12O12S2 b70eaeb200 delete broken and unnecessary aliases 2022-10-08 04:10:35 +03:00
AUTOMATIC 12c4d5c6b5 hypernetwork training mk1 2022-10-07 23:22:22 +03:00
AUTOMATIC f7c787eb7c make it possible to use hypernetworks without opt split attention 2022-10-07 16:39:51 +03:00
C43H66N12O12S2 5e3ff846c5 Update sd_hijack.py 2022-10-07 06:38:01 +03:00
C43H66N12O12S2 5303df2428 Update sd_hijack.py 2022-10-07 06:01:14 +03:00
C43H66N12O12S2 35d6b23162 Update sd_hijack.py 2022-10-07 05:31:53 +03:00
C43H66N12O12S2 2eb911b056 Update sd_hijack.py 2022-10-07 05:22:28 +03:00
Jairo Correa ad0cc85d1f Merge branch 'master' into stable 2022-10-02 18:31:19 -03:00
AUTOMATIC 88ec0cf557 fix for incorrect embedding token length calculation (will break seeds that use embeddings, you're welcome!)
add option to input initialization text for embeddings
2022-10-02 19:40:51 +03:00
AUTOMATIC 820f1dc96b initial support for training textual inversion 2022-10-02 15:03:39 +03:00
Jairo Correa ad1fbbae93 Merge branch 'master' into fix-vram 2022-09-30 18:58:51 -03:00
AUTOMATIC 98cc6c6e74 add embeddings dir 2022-09-30 14:16:26 +03:00
AUTOMATIC c715ef04d1 fix for incorrect model weight loading for #814 2022-09-29 15:40:28 +03:00
AUTOMATIC c1c27dad3b new implementation for attention/emphasis 2022-09-29 11:31:48 +03:00
Jairo Correa c2d5b29040 Move silu to sd_hijack 2022-09-29 01:16:25 -03:00
Liam e5707b66d6 switched the token counter to use hidden buttons instead of api call 2022-09-27 19:29:53 -04:00
Liam 5034f7d759 added token counter next to txt2img and img2img prompts 2022-09-27 15:56:18 -04:00