Don't symlink LoRAs when linking ckpts
The `link_ckpts` function does not filter out the `Lora` directory, yet this is the directory used to store LoRAs, which are linked separately later. This change fixes the function so that LoRAs are properly ignored when linking checkpoints.
This commit is contained in:
parent 0993272b9e
commit d8771b792c
@@ -1084,7 +1084,7 @@
 "    print('\\nLinking .ckpt and .safetensor/.safetensors/.st files in', source_path)\n",
 "    source_path = Path(source_path)\n",
 "    for file in [p for p in source_path.rglob('*') if p.suffix in ['.ckpt', '.safetensor', '.safetensors', '.st']]:\n",
-"        if Path(file).parent.parts[-1] not in ['hypernetworks', 'vae'] :\n",
+"        if Path(file).parent.parts[-1] not in ['hypernetworks', 'vae', 'Lora'] :\n",
 "            if not (webui_sd_model_path / file.name):\n",
 "                print('New model:', file.name)\n",
 "                create_symlink(file, webui_sd_model_path)\n",
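As a sketch of the filtering behavior this commit is after: the directory names and file suffixes below come from the diff, but `checkpoint_files` is a hypothetical standalone helper, not the actual `link_ckpts` implementation.

```python
from pathlib import Path

# Subdirectories whose contents must NOT be linked as checkpoints;
# 'Lora' is the entry added by this commit.
SKIP_DIRS = {'hypernetworks', 'vae', 'Lora'}

# Extensions treated as checkpoint files (from the diff).
CKPT_SUFFIXES = {'.ckpt', '.safetensor', '.safetensors', '.st'}

def checkpoint_files(source_path):
    """Yield checkpoint files under source_path, skipping LoRA,
    hypernetwork, and VAE subdirectories."""
    source_path = Path(source_path)
    for file in source_path.rglob('*'):
        if file.suffix in CKPT_SUFFIXES and file.parent.parts[-1] not in SKIP_DIRS:
            yield file
```

Without `'Lora'` in the skip set, a file such as `Lora/style.safetensors` matches the suffix filter and would be symlinked into the checkpoint directory alongside real checkpoints.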