cyberes / local-llm-server
local-llm-server/llm_server/routes/v1 (at bbe5d5a8fe)
Latest commit 6459a1c91b (Cyberes, 2023-09-25 00:55:20 -06:00): allow setting simultaneous IP limit per-token, fix token use tracker, fix tokens on streaming
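The latest commit mentions "setting simultaneous IP limit per-token". A common way to enforce that kind of limit is a counter of in-flight requests keyed by API token, checked before a request is served and decremented when it finishes. The sketch below is illustrative only: the class and method names are assumptions, not taken from the repository.

```python
# Hedged sketch of a per-token concurrent-request limiter, assuming the
# server tracks in-flight requests per API token. All names here are
# hypothetical; this is not the repository's actual implementation.
from collections import defaultdict
from threading import Lock

class SimultaneousLimiter:
    def __init__(self, default_limit=1):
        self.default_limit = default_limit
        self.per_token_limit = {}          # token -> allowed concurrent requests
        self.in_flight = defaultdict(int)  # token -> current in-flight count
        self.lock = Lock()

    def try_acquire(self, token):
        """Reserve a request slot for this token; False if at its limit."""
        with self.lock:
            limit = self.per_token_limit.get(token, self.default_limit)
            if self.in_flight[token] >= limit:
                return False
            self.in_flight[token] += 1
            return True

    def release(self, token):
        """Free a slot when the request (or stream) finishes."""
        with self.lock:
            self.in_flight[token] = max(0, self.in_flight[token] - 1)
```

A route handler would call `try_acquire` on entry, reject with an error (e.g. HTTP 429) when it returns `False`, and call `release` in a `finally` block so streaming responses also free their slot.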
__init__.py: cache stats in background (2023-09-17 18:55:36 -06:00)
generate.py: further align openai endpoint with expected responses (2023-09-24 21:45:30 -06:00)
generate_stats.py: minor changes, add admin token auth system, add route to get backend info (2023-09-24 15:54:35 -06:00)
generate_stream.py: allow setting simultaneous IP limit per-token, fix token use tracker, fix tokens on streaming (2023-09-25 00:55:20 -06:00)
info.py: minor changes, add admin token auth system, add route to get backend info (2023-09-24 15:54:35 -06:00)
proxy.py: fix division by 0, prettify /stats json, add js var to home (2023-09-16 17:37:43 -06:00)