Repository contents:

- config
- llm_server
- other (systemd service file)
- .gitignore
- LICENSE
- README.md
- requirements.txt
- server.py


local-llm-server

An HTTP API to serve local LLM models.
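
The repository ships a requirements.txt and a server.py entry point, so the server is presumably started by installing the Python dependencies and running server.py; the exact invocation and configuration are not documented in this README. Once the server is listening, a client can talk to it over plain HTTP. The sketch below is illustrative only: the port, endpoint path, request fields, and response shape are assumptions, not documented behavior of this project.

```python
# Minimal client sketch for a local LLM HTTP API.
# NOTE: the base URL, endpoint path, payload fields, and response shape
# below are assumptions for illustration; check server.py and llm_server/
# for the actual API surface.
import json
import urllib.request

BASE_URL = "http://localhost:5000"      # assumed listen address and port
ENDPOINT = "/api/v1/generate"           # assumed text-generation endpoint


def generate(prompt: str, max_new_tokens: int = 200) -> str:
    """POST a prompt to the local server and return the generated text."""
    payload = json.dumps({
        "prompt": prompt,
        "max_new_tokens": max_new_tokens,   # assumed parameter name
    }).encode("utf-8")
    request = urllib.request.Request(
        BASE_URL + ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        body = json.loads(response.read().decode("utf-8"))
    # Assumed response shape: {"results": [{"text": "..."}]}
    return body["results"][0]["text"]


if __name__ == "__main__":
    print(generate("Write a haiku about local inference."))
```

The sketch uses only the standard library so it runs without extra dependencies; swap in the real endpoint and payload once the API is documented.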