Author | Commit | Message | Date
Pam | 8d7fa2f67c | sdp_attnblock_forward hijack | 2023-03-10 22:48:41 +05:00
Pam | 0981dea948 | sdp refactoring | 2023-03-10 12:58:10 +05:00
Pam | 37acba2633 | argument to disable memory efficient for sdp | 2023-03-10 12:19:36 +05:00
Pam | fec0a89511 | scaled dot product attention | 2023-03-07 00:33:13 +05:00
AUTOMATIC1111 | dfb3b8f398 | Merge branch 'master' into weighted-learning | 2023-02-19 12:41:29 +03:00
Shondoit | c4bfd20f31 | Hijack to add weighted_forward to model: return loss * weight map | 2023-02-15 10:03:59 +01:00
brkirch | 2016733814 | Apply hijacks in ddpm_edit for upcast sampling; to avoid import errors, ddpm_edit hijacks are done after an instruct pix2pix model is loaded. | 2023-02-07 22:53:45 -05:00
AUTOMATIC1111 | fecb990deb | Merge pull request #7309 from brkirch/fix-embeddings; fix embeddings, upscalers, and refactor `--upcast-sampling` | 2023-01-28 18:44:36 +03:00
AUTOMATIC | d04e3e921e | automatically detect v-parameterization for SD2 checkpoints | 2023-01-28 15:24:41 +03:00
brkirch | ada17dbd7c | Refactor conditional casting, fix upscalers | 2023-01-28 04:16:25 -05:00
brkirch | c4b9b07db6 | Fix embeddings dtype mismatch | 2023-01-26 09:00:15 -05:00
AUTOMATIC | 6073456c83 | write a comment for fix_checkpoint function | 2023-01-19 20:39:10 +03:00
AUTOMATIC | 924e222004 | add option to show/hide warnings; removed hiding warnings from LDSR; fixed/reworked few places that produced warnings | 2023-01-18 23:04:24 +03:00
AUTOMATIC | 085427de0e | make it possible for extensions/scripts to add their own embedding directories | 2023-01-08 09:37:33 +03:00
AUTOMATIC1111 | c295e4a244 | Merge pull request #6055 from brkirch/sub-quad_attn_opt; add Birch-san's sub-quadratic attention implementation | 2023-01-07 12:26:55 +03:00
AUTOMATIC | 79e39fae61 | CLIP hijack rework | 2023-01-07 01:46:13 +03:00
brkirch | 5deb2a19cc | Allow Doggettx's cross attention opt without CUDA | 2023-01-06 01:33:15 -05:00
brkirch | 3bfe2bb549 | Merge remote-tracking branch 'upstream/master' into sub-quad_attn_opt | 2023-01-06 00:15:22 -05:00
brkirch | f6ab5a39d7 | Merge branch 'AUTOMATIC1111:master' into sub-quad_attn_opt | 2023-01-06 00:14:20 -05:00
brkirch | d782a95967 | Add Birch-san's sub-quadratic attention implementation | 2023-01-06 00:14:13 -05:00
Vladimir Mandic | 21ee77db31 | add cross-attention info | 2023-01-04 08:04:38 -05:00
AUTOMATIC | f34c734172 | alt-diffusion integration | 2022-12-31 18:06:35 +03:00
AUTOMATIC | 3f401cdb64 | Merge remote-tracking branch 'baai-open-internal/master' into alt-diffusion | 2022-12-31 13:02:28 +03:00
AUTOMATIC | 505ec7e4d9 | cleanup some unneeded imports for hijack files | 2022-12-10 09:17:39 +03:00
AUTOMATIC | 7dbfd8a7d8 | do not replace entire unet for the resolution hack | 2022-12-10 09:14:45 +03:00
AUTOMATIC1111 | 2641d1b83b | Merge pull request #4978 from aliencaocao/support_any_resolution; patch UNet forward to support resolutions that are not multiples of 64 | 2022-12-10 08:45:41 +03:00
zhaohu xing | 5dcc22606d | add hash and fix undo hijack bug; Signed-off-by: zhaohu xing <920232796@qq.com> | 2022-12-06 16:04:50 +08:00
Zac Liu | a25dfebeed | Merge pull request #3 from 920232796/master; fix device support for mps; update the support for SD2.0 | 2022-12-06 09:17:57 +08:00
Zac Liu | 3ebf977a6e | Merge branch 'AUTOMATIC1111:master' into master | 2022-12-06 09:16:15 +08:00
zhaohu xing | 4929503258 | fix bugs; Signed-off-by: zhaohu xing <920232796@qq.com> | 2022-12-06 09:03:55 +08:00
AUTOMATIC | 0d21624cee | move #5216 to the extension | 2022-12-03 18:16:19 +03:00
AUTOMATIC | 89e1df013b | Merge remote-tracking branch 'wywywywy/autoencoder-hijack' | 2022-12-03 18:08:10 +03:00
AUTOMATIC1111 | a2feaa95fc | Merge pull request #5194 from brkirch/autocast-and-mps-randn-fixes; use devices.autocast() and fix MPS randn issues | 2022-12-03 09:58:08 +03:00
SmirkingFace | da698ca92e | Fixed AttributeError where openaimodel is not found | 2022-12-02 13:47:02 +01:00
zhaohu xing | 52cc83d36b | fix bugs; Signed-off-by: zhaohu xing <920232796@qq.com> | 2022-11-30 14:56:12 +08:00
zhaohu xing | 0831ab476c | Merge branch 'master' into master | 2022-11-30 10:13:17 +08:00
wywywywy | 36c3613d16 | Add autoencoder to sd_hijack | 2022-11-29 17:40:02 +00:00
zhaohu xing | 75c4511e6b | add AltDiffusion to webui; Signed-off-by: zhaohu xing <920232796@qq.com> | 2022-11-29 10:28:41 +08:00
brkirch | 98ca437edf | Refactor and instead check if mps is being used, not availability | 2022-11-28 21:18:51 -05:00
AUTOMATIC | b48b7999c8 | Merge remote-tracking branch 'flamelaw/master' | 2022-11-27 12:19:59 +03:00
Billy Cao | 349f0461ec | Merge branch 'master' into support_any_resolution | 2022-11-27 12:39:31 +08:00
AUTOMATIC | 64c7b7975c | restore hypernetworks to seemingly working state | 2022-11-26 16:45:57 +03:00
AUTOMATIC | ce6911158b | Add support for Stable Diffusion 2.0 | 2022-11-26 16:10:46 +03:00
Billy Cao | adb6cb7619 | Patch UNet forward to support resolutions that are not multiples of 64; also modified the UI to no longer step in 64 | 2022-11-23 18:11:24 +08:00
flamelaw | bd68e35de3 | Gradient accumulation, autocast fix, new latent sampling method, etc. | 2022-11-20 12:35:26 +09:00
killfrenzy96 | 17e4432820 | cleanly undo circular hijack #4818 | 2022-11-18 21:22:55 +11:00
AUTOMATIC | c62d17aee3 | use the new devices.has_mps() function in register_buffer for DDIM/PLMS fix for OSX | 2022-11-12 10:00:22 +03:00
AUTOMATIC | 7ba3923d5b | move DDIM/PLMS fix for OSX out of the file with inpainting code | 2022-11-11 18:20:18 +03:00
Jairo Correa | af758e97fa | Unload sd_model before loading the other | 2022-11-01 04:01:49 -03:00
AUTOMATIC | 2b91251637 | removed aesthetic gradients as built-in; added support for extensions | 2022-10-22 12:23:58 +03:00