From 4c5ce81b314722bf9eb09c4396c2520e299ebc77 Mon Sep 17 00:00:00 2001
From: Victor Hall
Date: Sat, 6 May 2023 00:54:47 -0400
Subject: [PATCH] Update OPTIMIZER.md

---
 doc/OPTIMIZER.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/doc/OPTIMIZER.md b/doc/OPTIMIZER.md
index 0738c9c..892ad07 100644
--- a/doc/OPTIMIZER.md
+++ b/doc/OPTIMIZER.md
@@ -16,7 +16,7 @@ If you do not set `optimizer_config` at all or set it to `null` in train.json, t
 
 ## Optimizers
 
-In `optimizer.json` the you can set independent optimizer settings for both the text encoder and unet. If you want shared settings, just fill out the `unet` section and leave `text_encoder` properties null an they will be copied from the `unet` section, only respecting `text_encoder_lr_scale` if set.
+In `optimizer.json` you can set independent optimizer settings for both the text encoder and unet. If you want shared settings, just fill out the `base` section and leave the `text_encoder_overrides` properties null; they will be copied from the `base` section. If you set the `text_encoder_lr_scale` property, the text encoder will be trained with a multiple of the Unet learning rate when the LR is being copied. If you explicitly set the text encoder LR, `text_encoder_lr_scale` is ignored. `text_encoder_lr_scale` is likely to be deprecated in the future, but for now it is kept for backwards compatibility.
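
The split between a shared `base` section and a sparse `text_encoder_overrides` section described in the patched paragraph might look like the following sketch. The `optimizer`, `lr`, and `weight_decay` keys and their values are illustrative assumptions, not taken from the patch; only `base`, `text_encoder_overrides`, and `text_encoder_lr_scale` appear in the diff above.

```json
{
    "base": {
        "optimizer": "adamw8bit",
        "lr": 1e-6,
        "weight_decay": 0.010
    },
    "text_encoder_overrides": {
        "lr": null
    },
    "text_encoder_lr_scale": 0.5
}
```

With `lr` left null in `text_encoder_overrides`, the text encoder inherits the `base` LR scaled by `text_encoder_lr_scale` (here a hypothetical 0.5); setting an explicit `lr` override instead would cause the scale to be ignored.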