Don't symlink LoRAs when linking ckpts

The `link_ckpts` function does not filter out the `Lora` directory, yet this is the directory used to store LoRAs, which are linked separately later. This change fixes the function so that LoRAs are properly ignored when linking checkpoints.
This commit is contained in:
Pierre Buyle 2023-05-12 14:37:28 -04:00 committed by GitHub
parent 0993272b9e
commit d8771b792c
1 changed file with 1 addition and 1 deletion


@@ -1084,7 +1084,7 @@
 "    print('\\nLinking .ckpt and .safetensor/.safetensors/.st files in', source_path)\n",
 "    source_path = Path(source_path)\n",
 "    for file in [p for p in source_path.rglob('*') if p.suffix in ['.ckpt', '.safetensor', '.safetensors', '.st']]:\n",
-"        if Path(file).parent.parts[-1] not in ['hypernetworks', 'vae'] :\n",
+"        if Path(file).parent.parts[-1] not in ['hypernetworks', 'vae', 'Lora'] :\n",
 "            if not (webui_sd_model_path / file.name):\n",
 "                print('New model:', file.name)\n",
 "                create_symlink(file, webui_sd_model_path)\n",