From a3786a288f220396c5fa0c9804dfc871732867fe Mon Sep 17 00:00:00 2001
From: AbdBarho
Date: Sat, 26 Nov 2022 15:26:32 +0100
Subject: [PATCH] Updated Usage (markdown)

---
 Usage.md | 24 ++----------------------
 1 file changed, 2 insertions(+), 22 deletions(-)

diff --git a/Usage.md b/Usage.md
index a5c30ff..c77f1a5 100644
--- a/Usage.md
+++ b/Usage.md
@@ -24,7 +24,7 @@ By default: `--medvram` are given, which allow you to use this model on a 6GB GP
 
 ### Custom models
 
-This also has support for custom models, put the weights in the folder `data/StableDiffusion`, you can then change the model from the settings tab.
+Put the weights in the folder `data/StableDiffusion`, you can then change the model from the settings tab.
 
 ### General Config
 There is multiple files in `data/config/auto` such as `config.json` and `ui-config.json` which let you which contain additional config for the UI.
@@ -34,16 +34,7 @@ put your scripts `data/config/auto/scripts` and restart the container
 
 ### Extensions
 
-First, you have to add `--enable-insecure-extension-access` to your `CLI_ARGS` in your `docker-compose.override.yml`:
-```yml
-services:
-  auto:
-    environment:
-      # put whatever other flags you want
-      - CLI_ARGS=--enable-insecure-extension-access --allow-code --medvram --xformers
-```
-
-Then, put your extensions in `data/config/auto/extensions`, there is also the option to create a script `data/config/auto/startup.sh` which will be called on container startup, in case you want to install any additional dependencies for your extensions or anything else.
+You can use the UI to install extensions, or, you can put your extensions in `data/config/auto/extensions`, there is also the option to create a script `data/config/auto/startup.sh` which will be called on container startup, in case you want to install any additional dependencies for your extensions or anything else.
 
 An example of your `startup.sh` might looks like this:
 ```sh
@@ -58,16 +49,5 @@ done
 
 I maintain neither the UI nor the extension, I can't help you.
 
-
-
 # `auto-cpu`
 CPU instance of the above, some stuff might not work, use at your own risk.
-
-
-# `hlky`
-By default: `--optimized-turbo` is given, which allow you to use this model on a 6GB GPU. However, some features might not be available in the mode. [You can find the full list of cli arguments here.](https://github.com/sd-webui/stable-diffusion-webui/blob/2236e8b5854092054e2c30edc559006ace53bf96/scripts/webui.py)
-
-
-# `lstein`
-
-This fork might require a preload to work, see [#72](https://github.com/AbdBarho/stable-diffusion-webui-docker/issues/72)
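
The body of the `startup.sh` example referenced above sits between the hunks and is not included in this patch. As an illustration only, a minimal sketch of the kind of script the page describes (installing extra dependencies for extensions at container startup) might look like the following; the package names are placeholders, not taken from the wiki page:

```sh
#!/bin/bash
# Hypothetical data/config/auto/startup.sh -- executed on every container start.
# The package names below are placeholders, not part of the wiki page or this patch.
set -eu

for pkg in onnxruntime opencv-python-headless; do
    pip install --no-cache-dir "$pkg"
done
```

Because the script runs on every container start, it should be idempotent (safe to run repeatedly); plain `pip install` calls qualify, since they are effectively no-ops once a package is already installed.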