update readme

parent f7743ade89
commit bf1842f434

README.md | 49

@@ -2,4 +2,51 @@
_An HTTP API to serve local LLM models._

The purpose of this server is to abstract your LLM backend from your frontend API. This enables you to make changes to (or even switch) your backend without affecting your clients.

### Install

1. `sudo apt install redis`
2. `python3 -m venv venv`
3. `source venv/bin/activate`
4. `pip install -r requirements.txt`
5. `python3 server.py`

An example systemd service file is provided in `other/local-llm.service`.

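For reference, a unit file for this kind of server typically looks like the sketch below. This is illustrative only, not the contents of `other/local-llm.service`; the install path, venv location, and Redis unit name are assumptions.

```ini
# Hypothetical sketch; paths are assumptions, see other/local-llm.service for the real unit.
[Unit]
Description=local-llm-server (HTTP API for local LLM backends)
After=network.target redis-server.service

[Service]
WorkingDirectory=/opt/local-llm-server
ExecStart=/opt/local-llm-server/venv/bin/python3 server.py
Restart=on-failure

[Install]
WantedBy=multi-user.target
```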
### Configure

First, set up your LLM backend. Currently, only [oobabooga/text-generation-webui](https://github.com/oobabooga/text-generation-webui) is supported, but eventually [huggingface/text-generation-inference](https://github.com/huggingface/text-generation-inference) will be the default.

Then, configure this server. The config file is located at `config/config.yml`.

1. Set `backend_url` to the base API URL of your backend.
2. Set `token_limit` to the configured token limit of the backend. This number is shown to clients and on the home page.
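Taken together, those two required settings might look like this in `config/config.yml` (the values are the example values from this repo's sample config):

```yaml
# Minimal sketch of config/config.yml covering only the two required settings
backend_url: https://10.0.0.86:8083  # base API URL of the backend
token_limit: 7777                    # token limit configured on the backend
```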

To set up token auth, add rows to the `token_auth` table in the SQLite database.

`token`: the token/password.

`type`: the type of token. Currently unused (maybe for a future web interface?) but required.

`priority`: the lower this value, the higher the priority. Higher-priority tokens are bumped up in the queue.

`uses`: how many responses this token has generated. Leave empty.

`max_uses`: how many responses this token is allowed to generate. Leave empty to leave unrestricted.

`expire`: UNIX timestamp of when this token expires and is no longer valid.

`disabled`: mark the token as disabled.
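Since rows are added by hand, a short script can do the insert. This is a sketch only: the database path and column types are assumptions; just the column names come from the description above.

```python
import sqlite3
import time

# Assumption: the schema below is illustrative; only the column names
# (token, type, priority, uses, max_uses, expire, disabled) are documented.
conn = sqlite3.connect(":memory:")  # replace with the server's SQLite file
conn.execute(
    """CREATE TABLE IF NOT EXISTS token_auth (
           token TEXT PRIMARY KEY,
           type TEXT,
           priority INTEGER,
           uses INTEGER,
           max_uses INTEGER,
           expire INTEGER,
           disabled INTEGER
       )"""
)
# A high-priority token (lower number = higher priority), valid for 30 days,
# with unlimited uses (max_uses left NULL) and not disabled.
conn.execute(
    "INSERT INTO token_auth (token, type, priority, uses, max_uses, expire, disabled) "
    "VALUES (?, ?, ?, ?, ?, ?, ?)",
    ("example-token", "api", 1, None, None, int(time.time()) + 30 * 86400, 0),
)
conn.commit()
row = conn.execute(
    "SELECT priority, disabled FROM token_auth WHERE token = ?", ("example-token",)
).fetchone()
print(row)  # (1, 0)
```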

config/config.yml @@ -1,14 +1,19 @@

# TODO: add this file to gitignore and add a .sample.yml

## Important
backend_url: https://10.0.0.86:8083
mode: oobabooga
concurrent_gens: 3
token_limit: 7777

## Optional
log_prompts: true
verify_ssl: false # Python requests has issues with self-signed certs
auth_required: false
llm_middleware_name: proxy.chub-archive.evulid.cc
analytics_tracking_code: |

@@ -16,6 +16,15 @@

    display: inline-block;
}

a, a:visited {
    color: blue;
}

.footer {
    font-size: 7pt;
    text-align: center;
}

@media only screen and (max-width: 600px) {
    .container {
        padding: 1em;

@@ -53,6 +62,9 @@

<pre id="json">{{ stats_json|safe }}</pre>
</div>
<div class="footer">
    <a href="https://git.evulid.cc/cyberes/local-llm-server" target="_blank">git.evulid.cc/cyberes/local-llm-server</a>
</div>
</body>
</html>