cyberes/local-llm-server
local-llm-server/llm_server/routes/v1

Latest commit 8184e24bff by Cyberes: fix sending error messages when streaming (2023-09-25 17:37:58 -06:00)
__init__.py         cache stats in background                                                    2023-09-17 18:55:36 -06:00
generate.py         further align openai endpoint with expected responses                        2023-09-24 21:45:30 -06:00
generate_stats.py   fix typo                                                                     2023-09-25 17:24:51 -06:00
generate_stream.py  fix sending error messages when streaming                                    2023-09-25 17:37:58 -06:00
info.py             minor changes, add admin token auth system, add route to get backend info    2023-09-24 15:54:35 -06:00
proxy.py            fix division by 0, prettify /stats json, add js var to home                  2023-09-16 17:37:43 -06:00