# local-llm-server

An HTTP API to serve local LLM models.

sudo apt install redis
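The repository also contains a `requirements.txt` and a `server.py` entry point, so a full setup might look like the sketch below. The exact invocation (and any configuration the server expects) is an assumption based on the repository layout, not a documented command:

```shell
# Install the Redis server used for caching.
sudo apt install redis

# Install the Python dependencies listed in requirements.txt.
pip install -r requirements.txt

# Start the API server (entry point assumed from the repository layout).
python server.py
```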