Commit Graph

1131 Commits

Author SHA1 Message Date
DepFA d6a599ef9b change caption method 2022-10-10 00:07:52 +01:00
DepFA 0ac3a07eec add caption image with overlay 2022-10-10 00:05:36 +01:00
DepFA 01fd9cf0d2 change source of step count 2022-10-09 22:17:02 +01:00
DepFA 96f1e6be59 source checkpoint hash from current checkpoint 2022-10-09 22:14:50 +01:00
DepFA 6684610510 correct case on embeddingFromB64 2022-10-09 22:06:42 +01:00
DepFA d0184b8f76 change json tensor key name 2022-10-09 22:06:12 +01:00
DepFA 5d12ec82d3 add encoder and decoder classes 2022-10-09 22:05:09 +01:00
DepFA 969bd8256e add alternate checkpoint hash source 2022-10-09 22:02:28 +01:00
DepFA 03694e1f99 add embedding load and save from b64 json 2022-10-09 21:58:14 +01:00
DepFA fa0c5eb81b Add pretty image captioning functions 2022-10-09 20:41:22 +01:00
DepFA cd8673bd9b add embed embedding to ui 2022-10-09 05:40:57 +01:00
DepFA 5841990b0d Update textual_inversion.py 2022-10-09 05:38:38 +01:00
AUTOMATIC 050a6a798c support loading .yaml config with same name as model
support EMA weights in processing (????)
2022-10-08 23:26:48 +03:00
Aidan Holland 432782163a chore: Fix typos 2022-10-08 22:42:30 +03:00
Edouard Leurent 610a7f4e14 Break after finding the local directory of stable diffusion
Otherwise, we may override it with one of the next two paths (. or ..) if it is present there, and then the local paths of other modules (taming transformers, codeformers, etc.) won't be found in sd_path/../.

Fix https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/1085
2022-10-08 22:35:04 +03:00
AUTOMATIC 3b2141c5fb add 'Ignore last layers of CLIP model' option as a parameter to the infotext 2022-10-08 22:21:15 +03:00
AUTOMATIC e6e42f98df make --force-enable-xformers work without needing --xformers 2022-10-08 22:12:23 +03:00
Fampai 1371d7608b Added ability to ignore last n layers in FrozenCLIPEmbedder 2022-10-08 22:10:37 +03:00
DepFA b458fa48fe Update ui.py 2022-10-08 20:38:35 +03:00
DepFA 15c4278f1a TI preprocess wording
I had to check the code to work out what splitting was 🤷🏿
2022-10-08 20:38:35 +03:00
AUTOMATIC 3061cdb7b6 add --force-enable-xformers option and also add messages to console regarding cross attention optimizations 2022-10-08 19:22:15 +03:00
AUTOMATIC f9c5da1592 add fallback for xformers_attnblock_forward 2022-10-08 19:05:19 +03:00
Artem Zagidulin a5550f0213 alternate prompt 2022-10-08 18:12:19 +03:00
DepFA 34acad1628 Add GZipMiddleware to root demo 2022-10-08 18:03:16 +03:00
C43H66N12O12S2 cc0258aea7 check for ampere without destroying the optimizations. again. 2022-10-08 17:54:16 +03:00
C43H66N12O12S2 017b6b8744 check for ampere 2022-10-08 17:54:16 +03:00
C43H66N12O12S2 7e639cd498 check for 3.10 2022-10-08 17:54:16 +03:00
AUTOMATIC cfc33f99d4 why did you do this 2022-10-08 17:29:06 +03:00
Milly 4f33289d0f Fixed typo 2022-10-08 17:15:30 +03:00
AUTOMATIC 27032c47df restore old opt_split_attention/disable_opt_split_attention logic 2022-10-08 17:10:05 +03:00
AUTOMATIC dc1117233e simplify xformers options: --xformers to enable and that's it 2022-10-08 17:02:18 +03:00
AUTOMATIC 7ff1170a2e emergency fix for xformers (continue + shared) 2022-10-08 16:33:39 +03:00
AUTOMATIC1111 48feae37ff Merge pull request #1851 from C43H66N12O12S2/flash
xformers attention
2022-10-08 16:29:59 +03:00
C43H66N12O12S2 970de9ee68 Update sd_hijack.py 2022-10-08 16:29:43 +03:00
C43H66N12O12S2 7ffea15078 Update requirements_versions.txt 2022-10-08 16:24:06 +03:00
C43H66N12O12S2 ca5f0f149c Update launch.py 2022-10-08 16:22:38 +03:00
C43H66N12O12S2 69d0053583 update sd_hijack_opt to respect new env variables 2022-10-08 16:21:40 +03:00
C43H66N12O12S2 ddfa9a9786 add xformers_available shared variable 2022-10-08 16:20:41 +03:00
C43H66N12O12S2 26b459a379 default to split attention if cuda is available and xformers is not 2022-10-08 16:20:04 +03:00
C43H66N12O12S2 d0e85873ac check for OS and env variable 2022-10-08 16:13:26 +03:00
MrCheeze 5f85a74b00 fix bug where when using prompt composition, hijack_comments generated before the final AND will be dropped 2022-10-08 15:48:04 +03:00
guaneec 32e428ff19 Remove duplicate event listeners 2022-10-08 15:47:24 +03:00
ddPn08 772db721a5 fix glob path in hypernetwork.py 2022-10-08 15:46:54 +03:00
AUTOMATIC 7001bffe02 fix AND broken for long prompts 2022-10-08 15:43:25 +03:00
AUTOMATIC 77f4237d1c fix bugs related to variable prompt lengths 2022-10-08 15:25:59 +03:00
C43H66N12O12S2 3f166be1b6 Update requirements.txt 2022-10-08 14:42:50 +03:00
C43H66N12O12S2 4201fd14f5 install xformers 2022-10-08 14:42:34 +03:00
AUTOMATIC 4999eb2ef9 do not let user choose his own prompt token count limit 2022-10-08 14:25:47 +03:00
Trung Ngo 00117a07ef check specifically for skipped 2022-10-08 13:40:39 +03:00
Trung Ngo 786d9f63aa Add button to skip the current iteration 2022-10-08 13:40:39 +03:00