{ "cells": [ { "cell_type": "markdown", "metadata": { "id": "gfKvWAVnz8OB", "tags": [] }, "source": [ "# AUTOMATIC1111's Stable Diffusion WebUI\n", "\n", "https://github.com/AUTOMATIC1111/stable-diffusion-webui\n", "\n", "Loosely based on https://colab.research.google.com/drive/1kw3egmSn-KgWsikYvOMjJkVDsPLjEMzl\n", "\n", "**Guides**\n", "- [Getting started on Paperspace](https://github.com/Engineer-of-Stuff/stable-diffusion-paperspace/blob/main/Docs/Paperspace%20Guide%20for%20Retards.md)\n", "- [Using the WebUI](https://rentry.org/voldy)\n", "- [Using the Inpainter](https://rentry.org/drfar)\n", "- [Textual Inversion](https://rentry.org/aikgx)\n", "- [Crowd-Sourced Prompts](https://lexica.art/)\n", "- [Artist Name Prompts](https://sgreens.notion.site/sgreens/4ca6f4e229e24da6845b6d49e6b08ae7?v=fdf861d1c65d456e98904fe3f3670bd3)\n", "- [Stable Diffusion Models](https://cyberes.github.io/stable-diffusion-models)\n", "- [Textual Inversion Models](https://cyberes.github.io/stable-diffusion-textual-inversion-models/)\n", "- [Have I Been Trained?](https://haveibeentrained.com/)" ] }, { "cell_type": "markdown", "metadata": { "tags": [] }, "source": [ "## Installation and Setup\n", "\n", "You must reinstall everything each time you restart the machine. If already downloaded, dependencies will be auto-updated." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Where to store the models**\n", "\n", "`/storage/` is persistent storage shared across all machines on your account.\n", "\n", "`/notebooks/` is storage for this notebook only.\n", "\n", "`/tmp/` is not a persistent directory, meaning your files there will be deleted when the machine turns off.\n", "\n", "
\n", "\n", "If you are having storage issues, set `repo_storage_dir` to `/tmp/stable-diffusion`.\n", "\n", "
\n", "\n", "You must uncomment the correct section and run the block below or else the notebook won't work!" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Free tier\n", "# free_tier = True # Enables the creation of symlinks back to /notebooks/\n", "# model_storage_dir = '/tmp/stable-diffusion/models' # Where the models will be downloaded to.\n", "# repo_storage_dir = '/notebooks' # Where the repository will be downloaded to.\n", "\n", "# Paid Tier\n", "# free_tier = False\n", "# model_storage_dir = '/storage/models'\n", "# repo_storage_dir = '/notebooks'\n", "\n", "\n", "activate_xformers = False # Enables the xformers optimizations using pre-built wheels.\n", " # Setting to True will automatically set up your environment/machine for xformers. \n", "\n", "link_novelai_anime_vae = False # Enables the linking of animevae.pt to each of the NovelAI models.\n", " # Set to True if you've downloaded both the NovelAI models and hypernetworks.\n", " \n", "download_scripts = False # Download custom scripts? Only reason why you would leave it disabled is because it may\n", " # take a while to complete.\n", "\n", "activate_deepdanbooru = False # Enable and install DeepDanbooru -> https://github.com/KichangKim/DeepDanbooru\n", "\n", "# Don't put a trailing slash on directory paths.\n", "# To reset your storage directory, rerun this cell.\n", "\n", "# ===============================================================\n", "# Save variables to Jupiter's temp storage so we can access it even if the kernel restarts.\n", "%store free_tier model_storage_dir repo_storage_dir activate_xformers link_novelai_anime_vae download_scripts activate_deepdanbooru" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Don't forget, there's a script to update this notebook to [the latest version](https://github.com/Engineer-of-Stuff/stable-diffusion-paperspace/blob/main/StableDiffusionUI_Voldemort_paperspace.ipynb) on GitHub.**" ] }, { "cell_type": "markdown", "metadata": { "tags": [] }, "source": [ "### Clone the central repository" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "sBbcB4vwj_jm", "tags": [] }, "outputs": [], "source": [ "import os\n", "%store -r free_tier model_storage_dir repo_storage_dir activate_xformers link_novelai_anime_vae activate_deepdanbooru\n", "%cd /notebooks/\n", "\n", "def delete_broken_symlinks(path):\n", " # make sure to pass this function a path without a trailing slash\n", " for file in os.listdir(path):\n", " if os.path.islink(f'{path}/{file}') and not os.path.exists(os.readlink(f'{path}/{file}')):\n", " print(f'Symlink broken, removing: {file}')\n", " os.unlink(f'{path}/{file}')\n", "\n", "def update_repo_if_not_exists(path, repo_clone_url, pre=None):\n", " if pre is not None:\n", " pre() \n", " if not os.path.exists(path):\n", " !git clone \"{repo_clone_url}\" \"{path}\"\n", " else:\n", " print(f'{repo_clone_url.split(\"/\")[-1]} already downloaded, updating...')\n", " !cd \"{path}\" && git pull # no % so we don't interfere with the main process\n", "\n", "def init_free():\n", " if (free_tier and repo_storage_dir != '/notebooks'):\n", " delete_broken_symlinks('/notebooks/') # remove broken symlinks since it might have been installed in a non-persistent directory\n", " if not os.path.exists(repo_storage_dir):\n", " !mkdir -p \"{repo_storage_dir}\"\n", " !ln -s \"{repo_storage_dir}\" /notebooks/\n", " !ls -la /notebooks/stable-diffusion\n", 
"update_repo_if_not_exists(f'{repo_storage_dir}/stable-diffusion-webui', 'https://github.com/AUTOMATIC1111/stable-diffusion-webui', init_free)" ] }, { "cell_type": "markdown", "metadata": { "id": "C68TUpkq0nj_", "tags": [] }, "source": [ "### Install requirements and download repositories" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "SaAJk33ppFw1", "scrolled": true, "tags": [] }, "outputs": [], "source": [ "%store -r free_tier model_storage_dir repo_storage_dir activate_xformers link_novelai_anime_vae download_scripts activate_deepdanbooru\n", "%cd \"{repo_storage_dir}/stable-diffusion-webui\"\n", "import os\n", "\n", "# Import launch.py which will automatically run the install script but not launch the WebUI.\n", "# They require a few specific external git repo commits so we have to do it their way. \n", "import launch\n", "launch.prepare_enviroment()\n", "\n", "# The installer isn't installing deepdanbooru right now so we'll do it manually\n", "if activate_deepdanbooru:\n", " !pip install \"git+https://github.com/KichangKim/DeepDanbooru.git@edf73df4cdaeea2cf00e9ac08bd8a9026b7a7b26#egg=deepdanbooru[tensorflow]\" # tensorflow==2.10.0 tensorflow-io==0.27.0 flatbuffers==1.12\n", "\n", "# latent-diffusion is a requirement but launch.py isn't downloading it so we'll do it manually.\n", "if not os.path.exists(f'{repo_storage_dir}/stable-diffusion-webui/repositories/latent-diffusion'):\n", " !git clone https://github.com/crowsonkb/k-diffusion.git \"{repo_storage_dir}/stable-diffusion-webui/repositories/k-diffusion\"\n", " !git clone https://github.com/Hafiidz/latent-diffusion.git \"{repo_storage_dir}/stable-diffusion-webui/repositories/latent-diffusion\"\n", "\n", "# Download the GFPGAN face restorer.\n", "if not os.path.exists(f'{repo_storage_dir}/stable-diffusion-webui/GFPGANv1.3.pth'):\n", " !wget https://github.com/TencentARC/GFPGAN/releases/download/v1.3.0/GFPGANv1.3.pth -O \"{repo_storage_dir}/stable-diffusion-webui/GFPGANv1.3.pth\"\n", "else:\n", " print('GFPGANv1.3.pth already downloaded')\n", "\n", "# Download popular custom scripts. 
This is basically remote code execution so be careful.\n", "# See https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Custom-Scripts\n", "if download_scripts:\n", " !pip install requests\n", " import shutil\n", " import requests\n", " !pip install moviepy==1.0.3\n", " !apt update\n", " !apt install -y potrace python3-tk\n", "\n", " def download_file_dir(url, output_dir):\n", " # output_dir must have a trailing slash\n", " local_filename = url.split('/')[-1]\n", " with requests.get(url, stream=True) as r:\n", " r.raise_for_status()\n", " with open(f'{output_dir}{local_filename}', 'wb') as f:\n", " for chunk in r.iter_content(chunk_size=8192):\n", " f.write(chunk)\n", " return local_filename\n", " def do_script_download(scripts_list, domain, path):\n", " for item in scripts_list:\n", " download_file_dir(f'https://{domain}/{item}', path)\n", " print(f'{item.split(\"/\")[-1]} downloaded...')\n", "\n", " do_script_download([\n", " 'GRMrGecko/stable-diffusion-webui-automatic/advanced_matrix/scripts/advanced_prompt_matrix.py',\n", " 'jtkelm2/stable-diffusion-webui-1/master/scripts/wildcards.py',\n", " 'dfaker/stable-diffusion-webui-cv2-external-masking-script/main/external_masking.py',\n", " 'memes-forever/Stable-diffusion-webui-video/main/videos.py',\n", " 'yownas/seed_travel/main/scripts/seed_travel.py',\n", " 'Animator-Anon/Animator/main/animation.py',\n", " 'Filarius/stable-diffusion-webui/master/scripts/vid2vid.py',\n", " 'GeorgLegato/Txt2Vectorgraphics/main/txt2vectorgfx.py',\n", " 'yownas/shift-attention/main/scripts/shift_attention.py',\n", " 'DiceOwl/StableDiffusionStuff/main/loopback_superimpose.py',\n", " 'Engineer-of-Stuff/stable-diffusion-paperspace/main/lfs/save_steps.py',\n", " 'Pfaeff/sd-web-ui-scripts/main/moisaic.py'\n", " ], 'raw.githubusercontent.com', f'{repo_storage_dir}/stable-diffusion-webui/scripts/')\n", "\n", " do_script_download([\n", " 'dfaker/f88aa62e3a14b559fe4e5f6b345db664/raw/791dabfa0ab26399aa2635bcbc1cf6267aa4ffc2/alternate_sampler_noise_schedules.py',\n", " 'camenduru/9ec5f8141db9902e375967e93250860f/raw/c1a03eb447548adbef1858c0e69d3567a390d2f4/run_n_times.py'\n", " ], 'gist.githubusercontent.com', f'{repo_storage_dir}/stable-diffusion-webui/scripts/')\n", "\n", " # Download and set up txt2img2img\n", " update_repo_if_not_exists(f'{repo_storage_dir}/stable-diffusion-webui/txt2img2img_root', 'https://github.com/ThereforeGames/txt2img2img.git')\n", " !cp -r \"{repo_storage_dir}/stable-diffusion-webui/txt2img2img_root/scripts\" \"{repo_storage_dir}/stable-diffusion-webui\"\n", " !cp -r \"{repo_storage_dir}/stable-diffusion-webui/txt2img2img_root/txt2img2img\" \"{repo_storage_dir}/stable-diffusion-webui\"\n", " !cp -r \"{repo_storage_dir}/stable-diffusion-webui/txt2img2img_root/venv\" \"{repo_storage_dir}/stable-diffusion-webui\"\n", "\n", " # Download and set up txt2mask\n", " update_repo_if_not_exists(f'{repo_storage_dir}/stable-diffusion-webui/txt2mask', 'https://github.com/ThereforeGames/txt2mask.git')\n", " !echo \"Copying txt2mask...\"\n", " !cp -r \"{repo_storage_dir}/stable-diffusion-webui/txt2mask/repositories/clipseg\" \"{repo_storage_dir}/stable-diffusion-webui/repositories\"\n", " !cp -r \"{repo_storage_dir}/stable-diffusion-webui/txt2mask/scripts/\" \"{repo_storage_dir}/stable-diffusion-webui/\"\n", "\n", " # Do the wildcard script\n", " !mkdir -p \"{repo_storage_dir}/stable-diffusion-webui/scripts/wildcards\"\n", " do_script_download([\n", " 'jtkelm2/stable-diffusion-webui-1/master/scripts/wildcards/adjective.txt',\n", " 
'jtkelm2/stable-diffusion-webui-1/master/scripts/wildcards/artist.txt',\n", "        'jtkelm2/stable-diffusion-webui-1/master/scripts/wildcards/genre.txt',\n", "        'jtkelm2/stable-diffusion-webui-1/master/scripts/wildcards/site.txt',\n", "        'jtkelm2/stable-diffusion-webui-1/master/scripts/wildcards/style.txt'\n", "    ], 'raw.githubusercontent.com', f'{repo_storage_dir}/stable-diffusion-webui/scripts/wildcards/')\n", "\n", "if activate_xformers:\n", "    # Completely remove CUDA from your system\n", "    # The container comes with CUDA 11.2 installed, which is incompatible with PyTorch\n", "    # We MUST install CUDA 11.3\n", "    !apt update\n", "    !apt purge -y cuda*\n", "    !apt autoremove --purge -y\n", "    !wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/cuda-ubuntu2004-keyring.gpg -O /usr/share/keyrings/cuda-archive-keyring.gpg\n", "    !echo \"deb [signed-by=/usr/share/keyrings/cuda-archive-keyring.gpg] https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/ /\" | sudo tee /etc/apt/sources.list.d/cuda-ubuntu2004-x86_64.list\n", "    # Reinstall the CUDA packages we removed, but in version 11.3\n", "    !apt update && apt install -y cuda-command-line-tools-11-3 cuda-compat-11-3 cuda-minimal-build-11-3 cuda-compiler-11-3 cuda-libraries-dev-11-3 cuda-cupti-dev-11-3 cuda-cupti-11-3 cuda-nvcc-11-3 cuda-cudart-dev-11-3 cuda-libraries-11-3 cuda-cudart-11-3 cuda-gdb-11-3 cuda-cuobjdump-11-3 cuda-cuxxfilt-11-3 cuda-driver-dev-11-3 cuda-memcheck-11-3 cuda-nvdisasm-11-3 cuda-nvml-dev-11-3 cuda-nvprof-11-3 cuda-nvprune-11-3 cuda-nvrtc-dev-11-3 cuda-nvrtc-11-3 cuda-nvtx-11-3 cuda-sanitizer-11-3\n", "    # Set up pip packages\n", "    !pip uninstall -y torch torchvision torchaudio # Remove the existing PyTorch install.\n", "    !pip install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu113 # Install PyTorch built for CUDA 11.3\n", "    from subprocess import getoutput\n", "    s = getoutput('nvidia-smi')\n", "    if 'A4000' in s:\n", "        !pip install https://raw.githubusercontent.com/Cyberes/xformers-compiled/main/A4000/xformers-0.0.14.dev0-cp39-cp39-linux_x86_64.whl\n", "    elif 'P5000' in s:\n", "        !pip install https://raw.githubusercontent.com/Cyberes/xformers-compiled/main/P5000/xformers-0.0.14.dev0-cp39-cp39-linux_x86_64.whl\n", "    elif 'RTX 5000' in s:\n", "        !pip install https://raw.githubusercontent.com/Cyberes/xformers-compiled/main/RTX%205000/xformers-0.0.14.dev0-cp39-cp39-linux_x86_64.whl\n", "    elif 'RTX 4000' in s:\n", "        !pip install https://raw.githubusercontent.com/Cyberes/xformers-compiled/main/RTX%204000/xformers-0.0.14.dev0-cp39-cp39-linux_x86_64.whl\n", "    else:\n", "        print('XFORMERS ERROR: no wheel built for your GPU! Xformers disabled.')\n",
"        activate_xformers = False\n", "        %store activate_xformers\n", "\n", "# Make sure your models storage directory exists\n", "!mkdir -p \"{model_storage_dir}/hypernetworks\"\n", "!mkdir -p \"{repo_storage_dir}/stable-diffusion-webui/models/hypernetworks\"\n", "\n", "# Make sure the log dir exists\n", "# Link the output folders to /notebooks/outputs\n", "!mkdir -p \"{repo_storage_dir}/stable-diffusion-webui/log/images\"\n", "!mkdir -p \"{repo_storage_dir}/stable-diffusion-webui/outputs\"\n", "!ln -s \"{repo_storage_dir}/stable-diffusion-webui/outputs\" /notebooks/\n", "!ln -s \"{repo_storage_dir}/stable-diffusion-webui/log\" \"{repo_storage_dir}/stable-diffusion-webui/outputs\"\n", "\n", "\n", "!echo \"Done!\"" ] }, { "cell_type": "markdown", "metadata": { "id": "F0EINk5M0s-w", "tags": [] }, "source": [ "### Download the Model" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "I've provided a few different ways of acquiring the models. Try the torrent option first. You don't need to repeat this step if you've already downloaded the models.\n", "\n", "There are a few additional models available here: https://cyberes.github.io/stable-diffusion-models\n", "\n", "If you're interested in textual inversion, here's the database: https://cyberes.github.io/stable-diffusion-textual-inversion-models\n", "\n", "**Filesize and Storage Disclaimer**\n", "\n", "The Paperspace free tier has only 5GB of storage space. If you're having storage issues, here are a few suggestions.\n", "1. Download everything to `/tmp/`.\n", "2. Add a payment method to your account. Storage overages are billed at \\\\$0.29/GB; billing runs monthly at midnight on the first of each month. With a payment method on file, Paperspace will let you use more storage, and if you time it right you shouldn't actually be charged for it.\n", "3. Upgrade to a Pro account. They'll give you 15GB, and you'll get longer runtimes and more powerful free GPUs.\n", "4. Use my referral code `KQLRH37`. You'll get \\\\$10 credit that you should be able to put towards the storage overage charges. Redeem the code at the bottom of the Billing page.\n", "\n", "If you're on the free tier, only download one model.\n", "\n", "**Torrent Instructions**\n", "\n", "Aria2 may show some errors/warnings while downloading. Those are fine; when it eventually says \"Download Complete\", everything worked as it should."
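, "\n", "\n", "If you want to confirm a model finished downloading (and see how much space it's taking up), you can run `!ls -lh \"{model_storage_dir}\"` in a new code cell; a download that is still in progress will have a matching `.aria2` control file next to it."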
] }, { "cell_type": "markdown", "metadata": { "tags": [] }, "source": [ "#### Standard Model" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Torrent**" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "tags": [] }, "outputs": [], "source": [ "%store -r free_tier model_storage_dir repo_storage_dir activate_xformers link_novelai_anime_vae\n", "!apt update\n", "!apt install -y aria2\n", "%cd $model_storage_dir\n", "!aria2c --seed-time=0 --max-overall-upload-limit=1K --bt-max-peers=120 --summary-interval=0 --file-allocation=none \"magnet:?xt=urn:btih:3A4A612D75ED088EA542ACAC52F9F45987488D1C&tr=udp://tracker.opentrackr.org:1337/announce\"" ] }, { "cell_type": "markdown", "metadata": { "tags": [] }, "source": [ "#### Waifu Diffusion\n", "\n", "The original Stable Diffusion anime finetune.\n", "\n", "**v1.3**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "%store -r free_tier model_storage_dir repo_storage_dir activate_xformers link_novelai_anime_vae\n", "!apt update\n", "!apt install -y aria2\n", "%cd \"{model_storage_dir}\"\n", "!aria2c --seed-time=0 --bt-max-peers=120 --summary-interval=0 --file-allocation=none \"magnet:?xt=urn:btih:AWJJJZNFOOK7R2XXXGZ4GFNKUEU2TSFP&dn=wd-v1-3-float16.ckpt&xl=2132889245&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce\"" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**v1.2**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "%store -r free_tier model_storage_dir repo_storage_dir activate_xformers link_novelai_anime_vae\n", "!apt update\n", "!apt install -y aria2\n", "%cd \"{model_storage_dir}\"\n", "!aria2c --seed-time=0 --max-overall-upload-limit=1K --bt-max-peers=120 --summary-interval=0 --file-allocation=none \"magnet:?xt=urn:btih:153590FD7E93EE11D8DB951451056C362E3A9150&dn=wd-v1-2-full-ema-pruned.ckpt&tr=udp://tracker.opentrackr.org:1337\"" ] }, { "cell_type": "markdown", "metadata": { "tags": [] }, "source": [ "#### trinart_stable_diffusion_v2" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Another anime finetune. Designed to nudge SD to an anime/manga style. Seems to be more \"stylized\" and \"artistic\" than Waifu Diffusion, if that makes any sense.\n", "\n", "The 60,000 steps version is the original, the 115,000 and 95,000 versions is the 60,000 with additional training. 
Use the 60,000 step version if the style nudging is too much.\n", "\n", "[See the comparison here.](https://cyberes.github.io/stable-diffusion-models/#model-comparison)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**60000**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "%store -r free_tier model_storage_dir repo_storage_dir activate_xformers link_novelai_anime_vae\n", "!wget https://huggingface.co/naclbit/trinart_stable_diffusion_v2/resolve/main/trinart2_step60000.ckpt -O \"{model_storage_dir}/trinart2_step60000.ckpt\"" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**95000**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "%store -r free_tier model_storage_dir repo_storage_dir activate_xformers link_novelai_anime_vae\n", "!wget https://huggingface.co/naclbit/trinart_stable_diffusion_v2/resolve/main/trinart2_step95000.ckpt -O \"{model_storage_dir}/trinart2_step95000.ckpt\"" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**115000**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "%store -r free_tier model_storage_dir repo_storage_dir activate_xformers link_novelai_anime_vae\n", "!wget https://huggingface.co/naclbit/trinart_stable_diffusion_v2/resolve/main/trinart2_step115000.ckpt -O \"{model_storage_dir}/trinart2_step115000.ckpt\"" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### NovelAI Leak" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**animefull-final-pruned**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "%store -r free_tier model_storage_dir repo_storage_dir activate_xformers link_novelai_anime_vae\n", "!apt update\n", "!apt install -y aria2\n", "metalink = 'magnet:?xt=urn:btih:5bde442da86265b670a3e5ea3163afad2c6f8ecc&dn=novelaileak&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce&tr=udp%3A%2F%2F9.rarbg.com%3A2810%2Fannounce&tr=udp%3A%2F%2Ftracker.openbittorrent.com%3A6969%2Fannounce&tr=http%3A%2F%2Ftracker.openbittorrent.com%3A80%2Fannounce&tr=udp%3A%2F%2Fopentracker.i2p.rocks%3A6969%2Fannounce'\n", "import re\n", "infohash = re.search(\"^magnet:\\?xt=urn:btih:(.*?)&.*?$\", metalink).group(1)\n", "import subprocess\n", "tmp_dir = subprocess.check_output(['mktemp', '-d']).decode('ascii').strip('\\n')\n", "%cd \"{tmp_dir}\"\n", "# Have to convert the metalink to a torrent file so aria2c can read the files inside\n", "!aria2c -d . 
--bt-metadata-only=true --bt-save-metadata=true --bt-max-peers=120 --summary-interval=0 --file-allocation=none \"magnet:?xt=urn:btih:5bde442da86265b670a3e5ea3163afad2c6f8ecc&dn=novelaileak&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce&tr=udp%3A%2F%2F9.rarbg.com%3A2810%2Fannounce&tr=udp%3A%2F%2Ftracker.openbittorrent.com%3A6969%2Fannounce&tr=http%3A%2F%2Ftracker.openbittorrent.com%3A80%2Fannounce&tr=udp%3A%2F%2Fopentracker.i2p.rocks%3A6969%2Fannounce\"\n", "!aria2c --select-file=64,65 --seed-time=0 --max-overall-upload-limit=1K --bt-max-peers=120 --summary-interval=0 --file-allocation=none \"{infohash}.torrent\"\n", "!mv \"novelaileak/stableckpt/animefull-final-pruned/config.yaml\" \"{model_storage_dir}/novelai-animefull-final-pruned.yaml\"\n", "!mv \"novelaileak/stableckpt/animefull-final-pruned/model.ckpt\" \"{model_storage_dir}/novelai-animefull-final-pruned.ckpt\"" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**animesfw-final-pruned**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "%store -r free_tier model_storage_dir repo_storage_dir activate_xformers link_novelai_anime_vae\n", "!apt update\n", "!apt install -y aria2\n", "metalink = 'magnet:?xt=urn:btih:5bde442da86265b670a3e5ea3163afad2c6f8ecc&dn=novelaileak&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce&tr=udp%3A%2F%2F9.rarbg.com%3A2810%2Fannounce&tr=udp%3A%2F%2Ftracker.openbittorrent.com%3A6969%2Fannounce&tr=http%3A%2F%2Ftracker.openbittorrent.com%3A80%2Fannounce&tr=udp%3A%2F%2Fopentracker.i2p.rocks%3A6969%2Fannounce'\n", "import re\n", "infohash = re.search(\"^magnet:\\?xt=urn:btih:(.*?)&.*?$\", metalink).group(1)\n", "import subprocess\n", "tmp_dir = subprocess.check_output(['mktemp', '-d']).decode('ascii').strip('\\n')\n", "%cd \"{tmp_dir}\"\n", "# Have to convert the metalink to a torrent file so aria2c can read the files inside\n", "!aria2c -d . --bt-metadata-only=true --bt-save-metadata=true --bt-max-peers=120 --summary-interval=0 --file-allocation=none \"magnet:?xt=urn:btih:5bde442da86265b670a3e5ea3163afad2c6f8ecc&dn=novelaileak&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce&tr=udp%3A%2F%2F9.rarbg.com%3A2810%2Fannounce&tr=udp%3A%2F%2Ftracker.openbittorrent.com%3A6969%2Fannounce&tr=http%3A%2F%2Ftracker.openbittorrent.com%3A80%2Fannounce&tr=udp%3A%2F%2Fopentracker.i2p.rocks%3A6969%2Fannounce\"\n", "!aria2c --select-file=70,71 --seed-time=0 --max-overall-upload-limit=1K --bt-max-peers=120 --summary-interval=0 --file-allocation=none \"{infohash}.torrent\"\n", "!mv \"novelaileak/stableckpt/animesfw-final-pruned/config.yaml\" \"{model_storage_dir}/novelai-animesfw-final-pruned.yaml\"\n", "!mv \"novelaileak/stableckpt/animesfw-final-pruned/model.ckpt\" \"{model_storage_dir}/novelai-animesfw-final-pruned.ckpt\"" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Hypernetworks**\n", "\n", "A hypernetwork is trained much like a neural network and helps to guide the neural net towards the intended output." 
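, "\n", "\n", "The cell below downloads the NovelAI hypernetwork `.pt` files into `hypernetworks/` inside your model storage directory; the \"Link the models directory\" step later symlinks them into the WebUI's `models/hypernetworks/` folder, which is where the WebUI looks for them."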
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "%store -r free_tier model_storage_dir repo_storage_dir activate_xformers link_novelai_anime_vae\n", "!apt update\n", "!apt install -y aria2\n", "metalink = 'magnet:?xt=urn:btih:5bde442da86265b670a3e5ea3163afad2c6f8ecc&dn=novelaileak&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce&tr=udp%3A%2F%2F9.rarbg.com%3A2810%2Fannounce&tr=udp%3A%2F%2Ftracker.openbittorrent.com%3A6969%2Fannounce&tr=http%3A%2F%2Ftracker.openbittorrent.com%3A80%2Fannounce&tr=udp%3A%2F%2Fopentracker.i2p.rocks%3A6969%2Fannounce'\n", "import re\n", "infohash = re.search(\"^magnet:\\?xt=urn:btih:(.*?)&.*?$\", metalink).group(1)\n", "import subprocess\n", "tmp_dir = subprocess.check_output(['mktemp', '-d']).decode('ascii').strip('\\n')\n", "%cd \"{tmp_dir}\"\n", "# Have to convert the metalink to a torrent file so aria2c can read the files inside\n", "!aria2c -d . --bt-metadata-only=true --bt-save-metadata=true --bt-max-peers=120 --summary-interval=0 --file-allocation=none \"magnet:?xt=urn:btih:5bde442da86265b670a3e5ea3163afad2c6f8ecc&dn=novelaileak&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce&tr=udp%3A%2F%2F9.rarbg.com%3A2810%2Fannounce&tr=udp%3A%2F%2Ftracker.openbittorrent.com%3A6969%2Fannounce&tr=http%3A%2F%2Ftracker.openbittorrent.com%3A80%2Fannounce&tr=udp%3A%2F%2Fopentracker.i2p.rocks%3A6969%2Fannounce\"\n", "!aria2c --select-file=76,81,82,83,84,85,86,87,88,89,90,91,92,93 --seed-time=0 --max-overall-upload-limit=1K --bt-max-peers=120 --summary-interval=0 --file-allocation=none \"{infohash}.torrent\"\n", "# -exec mv doesn't work with python variables so we'll set an environment variable instead\n", "import os\n", "os.environ[\"MODEL_STORAGE_DIR\"] = model_storage_dir\n", "!rm novelaileak/stableckpt/extra-sd-prune/sd-prune/anime700k-64bs-0.1ucg-penultimate-1epoch-clip-ema-continue-76000.pt # aria2 downloads this file even though I told it not to\n", "!find novelaileak/ -type f -name '*.pt' -exec mv {} \"$MODEL_STORAGE_DIR/hypernetworks\" \\;" ] }, { "cell_type": "markdown", "metadata": { "tags": [] }, "source": [ "### Clean up and restart the kernel" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "%store -r free_tier model_storage_dir repo_storage_dir activate_xformers link_novelai_anime_vae\n", "\n", "# Get some storage back\n", "!pip cache purge\n", "!cd \"{model_storage_dir}\" && rm *.aria2\n", "!apt remove --purge -y aria2 p7zip-full\n", "!apt autoremove --purge -y\n", "!apt clean\n", "!rm -rf /var/lib/apt/lists/*\n", "\n", "# Restart the kernel\n", "import os\n", "os.kill(os.getpid(), 9)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Link the models directory\n", "\n", "Create symlinks. The file will be stored in the models storage directory and linked to where the WebUI expects the files to be." 
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "%store -r free_tier model_storage_dir repo_storage_dir activate_xformers link_novelai_anime_vae\n", "import os\n", "import glob\n", "\n", "def delete_broken_symlinks(dir):\n", " deleted = False\n", " for file in os.listdir(dir):\n", " path = f'{dir}/{file}'\n", " if os.path.islink(path) and not os.path.exists(os.readlink(path)):\n", " print(f'Symlink broken, removing: {file}')\n", " os.unlink(path)\n", " deleted = True\n", " if deleted:\n", " print('')\n", " \n", "def symlink_models(source_dir, filetype, destination_dir):\n", " # don't pass directory paths with trailing slash\n", " for file in os.listdir(source_dir):\n", " if file.endswith(filetype):\n", " path = f'{destination_dir}/{file}'\n", " if not os.path.exists(path):\n", " print(f'New model: {file}')\n", " !ln -s \"{source_dir}/{file}\" \"{destination_dir}/{file}\"\n", " !ls -la --block-size=GB \"{destination_dir}/{file}\"\n", "\n", "# Check for broken symlinks and remove them\n", "delete_broken_symlinks(f'{repo_storage_dir}/stable-diffusion-webui/models/Stable-diffusion')\n", "delete_broken_symlinks(f'{repo_storage_dir}/stable-diffusion-webui/models/hypernetworks')\n", "\n", "# Link models\n", "symlink_models(model_storage_dir, 'ckpt', f'{repo_storage_dir}/stable-diffusion-webui/models/Stable-diffusion')\n", "\n", "# Link config yaml files\n", "symlink_models(model_storage_dir, 'yaml', f'{repo_storage_dir}/stable-diffusion-webui/models/Stable-diffusion')\n", "\n", "# Link hypernetworks\n", "symlink_models(f'{model_storage_dir}/hypernetworks', 'pt', f'{repo_storage_dir}/stable-diffusion-webui/models/hypernetworks')\n", "\n", "# Link the NovelAI files for each of the NovelAI models\n", "os.chdir(f'{model_storage_dir}')\n", "for model in glob.glob('novelai-*.ckpt'):\n", " yaml = model.replace('.ckpt', '.yaml')\n", " if os.path.exists(yaml) and not os.path.exists(f'{repo_storage_dir}/stable-diffusion-webui/models/Stable-diffusion/{yaml}'): \n", " print(f'New NovelAI model config: {yaml}')\n", " !ln -s \"{model_storage_dir}/{yaml}\" \"{repo_storage_dir}/stable-diffusion-webui/models/Stable-diffusion/{yaml}\"\n", " !ls -la --block-size=GB \"{repo_storage_dir}/stable-diffusion-webui/models/Stable-diffusion/{yaml}\"\n", "\n", "if link_novelai_anime_vae:\n", " for model in glob.glob('novelai-*.ckpt'):\n", " if os.path.exists(f'{model_storage_dir}/hypernetworks/animevae.pt'):\n", " vae = model.replace('.ckpt', '.vae.pt')\n", " if not os.path.exists(f'{repo_storage_dir}/stable-diffusion-webui/models/Stable-diffusion/{vae}'):\n", " print(f'Linking NovelAI {vae} and {model}')\n", " !ln -s \"{model_storage_dir}/hypernetworks/animevae.pt\" \"{repo_storage_dir}/stable-diffusion-webui/models/Stable-diffusion/{vae}\"\n", " !ls -la --block-size=GB \"{repo_storage_dir}/stable-diffusion-webui/models/Stable-diffusion/{vae}\"\n", " else:\n", " print(f'{model_storage_dir}/hypernetworks/animevae.pt NOT FOUND')" ] }, { "cell_type": "markdown", "metadata": { "id": "xt8lbdmC04ox" }, "source": [ "# Launch the WebUI" ] }, { "cell_type": "markdown", "metadata": { "tags": [] }, "source": [ "Run this block to launch the WebUI. You will get a link to nnn.gradio.app, that's your WebUI. Follow it.\n", "\n", "- See [webui.py](https://github.com/AUTOMATIC1111/stable-diffusion-webui/blob/master/modules/shared.py#L22) to view the code for the launch args. There's a lot of good info in there about exactly what the args do. 
If you aren't a programmer, [here's the wiki](https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Run-with-Custom-Parameters).\n", "- If you have a lot of VRAM and desire high generation speeds, add `--disable-opt-split-attention` to disable VRAM optimizations.\n", "- If you have a decent amount of VRAM and aren't generating large images, you can remove `--medvram` for a speed boost.\n", "\n", "#### Troubleshooting\n", "- If you have any issues, try restarting the kernel.\n", "- `EOFError: Ran out of input` probably means you ran out of storage space and the model `.ckpt` file wasn't downloaded completely. Try cleaning up your files. There are some helpful scripts in the Tools section below.\n", "- If you're having issues with your results not loading, that's a known bug. I used to suggest ngrok, but accounts were apparently getting locked because tunneling proxies are against the terms of service, even though the Paperspace terms of service don't mention anything like that. [More details in the Tools section below.](https://github.com/anderspitman/awesome-tunneling)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "R-xAdMA5wxXd", "tags": [] }, "outputs": [], "source": [ "%store -r free_tier model_storage_dir repo_storage_dir activate_xformers activate_deepdanbooru\n", "%cd \"{repo_storage_dir}/stable-diffusion-webui\"\n", "\n", "# Enable optional args automatically\n", "x_arg = '--xformers' if activate_xformers else ''\n", "dd_arg = '--deepdanbooru' if activate_deepdanbooru else ''\n", "\n", "# Launch args go below:\n", "!python webui.py {x_arg} {dd_arg} --gradio-debug --share --medvram # --gradio-auth me:password1234" ] }, { "cell_type": "markdown", "metadata": { "jp-MarkdownHeadingCollapsed": true, "tags": [] }, "source": [ "# Export Generations" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This block will rename and compress the outputs with 7zip max compression. It expects you to have `log/` and `outputs/` in `/notebooks/stable-diffusion-webui/`." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "scrolled": true, "tags": [] }, "outputs": [], "source": [ "%store -r free_tier model_storage_dir repo_storage_dir activate_xformers link_novelai_anime_vae\n", "!apt update\n", "!apt install -y p7zip-full\n", "from datetime import datetime\n", "datetime_str = datetime.now().strftime('%m-%d-%Y_%H:%M:%S')\n", "%cd /notebooks/\n", "!mkdir -p \"{datetime_str}/log\"\n", "!cd \"{repo_storage_dir}/stable-diffusion-webui/log/\" && mv * \"/notebooks/{datetime_str}/log\"\n", "!cd \"{repo_storage_dir}/stable-diffusion-webui/outputs/\" && mv * \"/notebooks/{datetime_str}\"\n", "!TEMP=\"/notebooks/{datetime_str}\" # find command has issues with ipynb variables??\n", "# !find $TEMP -name .ipynb_checkpoints -exec rm -rf \"{}\" +\n", "!7z a -t7z -m0=lzma2 -mx=9 -mfb=64 -md=32m -ms=on \"{datetime_str}.7z\" \"/notebooks/{datetime_str}/\"" ] }, { "cell_type": "markdown", "metadata": { "tags": [] }, "source": [ "### Delete old output folder\n", "\n", "This block will delete the folder you just compressed."
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "!rm -rf \"/notebooks/{datetime_str}/\"\n", "!echo Deleted /notebooks/{datetime_str}/" ] }, { "cell_type": "markdown", "metadata": { "jp-MarkdownHeadingCollapsed": true, "tags": [] }, "source": [ "# Tools" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Show graphics card info" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "!nvidia-smi" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Download the latest version of this notebook from Github\n", "\n", "Run this and refresh the page (press F5). Don't save anything or you will overwrite the downloaded file." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "!mv /notebooks/StableDiffusionUI_Voldemort_paperspace.ipynb /notebooks/StableDiffusionUI_Voldemort_paperspace.ipynb.backup # save your old notebook to a backup\n", "!wget https://raw.githubusercontent.com/Engineer-of-Stuff/stable-diffusion-paperspace/main/StableDiffusionUI_Voldemort_paperspace.ipynb -O /notebooks/StableDiffusionUI_Voldemort_paperspace.ipynb\n", "!echo \"Downloaded! Now, refresh the page (press F5). Don't save anything or you will overwrite the downloaded file.\"" ] }, { "cell_type": "markdown", "metadata": { "tags": [] }, "source": [ "### Reset Repository\n", "\n", "Sometimes AUTOMATIC1111 breaks something. Go to https://github.com/AUTOMATIC1111/stable-diffusion-webui/commits/master and choose a commit to revert to.\n", "\n", "If you're looking for a specific date, do: `git log --since='Sept 17 2022' --until='Sept 18 2022'`\n", "\n", "\n", "**This shouldn't delete your outputs or any changes you've made to files, but I'd back up anything important just to be safe.**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "%store -r free_tier model_storage_dir repo_storage_dir activate_xformers link_novelai_anime_vae\n", "%cd \"{repo_storage_dir}/stable-diffusion-webui\"\n", "!git reset --hard " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Delete .ipynb_checkpoints\n", "\n", "Jupyter stores temporary files in folders named `.ipynb_checkpoints`. It gets a little excessive sometimes so if you're running low on storage space or getting weird errors about a directory named `.ipynb_checkpoints`, run this block." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "!find . -type d -name .ipynb_checkpoints -delete" ] }, { "cell_type": "markdown", "metadata": { "tags": [] }, "source": [ "### Reset storage\n", "\n", "This will delete ALL your files in `/notebooks/`, `/storage/`, `model_storage_dir`, and `repo_storage_dir`. Use if you're having issues with zero storage space and you don't want to delete your notebook." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Uncomment the lines below to run this block. 
"# %store -r free_tier model_storage_dir repo_storage_dir activate_xformers link_novelai_anime_vae\n", "# !rm -rf /storage/*\n", "# !mv /notebooks/*.ipynb / # move the notebook out of the directory before we nuke it\n", "# !rm -rf /notebooks/*\n", "# !mv /*.ipynb /notebooks/ # move it back\n", "# !rm -rf {model_storage_dir}\n", "# !rm -rf {repo_storage_dir}" ] }, { "cell_type": "markdown", "metadata": { "tags": [] }, "source": [ "### Build and Install Xformers\n", "\n", "This is an advanced feature that should boost your generation speeds.\n", "\n", "1. Run the block below to download the install script to `/notebooks/`.\n", "2. Go to https://developer.nvidia.com/cuda-gpus and find the CUDA arch for your GPU model. It's likely 7.5, but double-check.\n", "3. Once you have read these instructions, uncomment the second line and insert your CUDA arch.\n", "4. Add `--xformers` to your launch args.\n", "\n", "If you have any issues, open Jupyter Lab and run `build-xformers.sh` from the terminal." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "!wget https://raw.githubusercontent.com/Engineer-of-Stuff/stable-diffusion-paperspace/main/other/build-xformers.sh -O /notebooks/build-xformers.sh\n", "# !bash /notebooks/build-xformers.sh [your cuda arch]\n", "!echo \"COMPLETED!\"" ] } ], "metadata": { "accelerator": "GPU", "colab": { "collapsed_sections": [], "private_outputs": true, "provenance": [] }, "gpuClass": "standard", "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.9.13" } }, "nbformat": 4, "nbformat_minor": 4 }