cyberes / local-llm-server
Tree e3c57d874a — local-llm-server / llm_server / routes / v1

Latest commit: e0af2ea9c5 ("convert to gunicorn") by Cyberes, 2023-09-26 13:32:33 -06:00
| File | Last commit message | Date |
|------|---------------------|------|
| __init__.py | cache stats in background | 2023-09-17 18:55:36 -06:00 |
| generate.py | don't log entire request on failure | 2023-09-26 12:32:19 -06:00 |
| generate_stats.py | convert to gunicorn | 2023-09-26 13:32:33 -06:00 |
| generate_stream.py | update database, tokenizer handle null prompt, convert top_p to vllm on openai, actually validate prompt on streaming | 2023-09-25 22:32:48 -06:00 |
| info.py | minor changes, add admin token auth system, add route to get backend info | 2023-09-24 15:54:35 -06:00 |
| proxy.py | fix division by 0, prettify /stats json, add js var to home | 2023-09-16 17:37:43 -06:00 |