cyberes / local-llm-server (archived)
This repository has been archived on 2024-10-27. You can view files and clone it, but cannot push or open issues or pull requests.
local-llm-server/llm_server/routes/v1 (at commit ee83608a52)
Latest commit: e1d3fca6d3 by Cyberes, "try to cancel inference if disconnected from client" (2023-09-28 09:55:31 -06:00)
File                Last commit date            Message
__init__.py         2023-09-17 18:55:36 -06:00  cache stats in background
generate.py         2023-09-27 14:48:47 -06:00  unify error message handling
generate_stats.py   2023-09-28 03:54:20 -06:00  avoid sending to backend to tokenize if it's greater than our specified context size
generate_stream.py  2023-09-28 09:55:31 -06:00  try to cancel inference if disconnected from client
info.py             2023-09-26 22:09:11 -06:00  more work on openai endpoint
proxy.py            2023-09-26 22:09:11 -06:00  more work on openai endpoint
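The headline change, "try to cancel inference if disconnected from client" (generate_stream.py), maps to a common streaming pattern: when the WSGI server notices the client is gone, it closes the response generator, which raises GeneratorExit at the suspended yield, and that is the hook for telling the backend to stop. Below is a minimal sketch of that pattern, assuming a Flask app; the route path, `stop_event`, and `backend_stream` are hypothetical stand-ins, not this repository's actual code.

```python
# A minimal sketch of cancelling backend inference when the HTTP client
# disconnects mid-stream, assuming a Flask app. The route path, stop_event,
# and backend_stream are hypothetical stand-ins, not this repo's actual code.
import threading

from flask import Flask, Response, request

app = Flask(__name__)


def backend_stream(prompt: str, stop_event: threading.Event):
    """Placeholder for the real backend call: yields tokens and checks the
    stop flag between chunks so inference can be abandoned early."""
    for token in ["Hello", ", ", "world", "!"]:
        if stop_event.is_set():
            break
        yield token


@app.route("/api/v1/generate-stream", methods=["POST"])
def generate_stream():
    prompt = (request.get_json(silent=True) or {}).get("prompt", "")
    stop_event = threading.Event()

    def stream():
        try:
            for chunk in backend_stream(prompt, stop_event):
                yield chunk
        except GeneratorExit:
            # Werkzeug close()s the response generator when the client goes
            # away, which raises GeneratorExit here; flag the backend to stop.
            stop_event.set()
            raise

    return Response(stream(), mimetype="text/event-stream")
```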
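Similarly, "avoid sending to backend to tokenize if it's greater than our specified context size" (generate_stats.py) suggests a cheap local guard before paying for a tokenizer round-trip. A sketch of that idea, with an assumed 4-characters-per-token heuristic and hypothetical names:

```python
# A sketch of the local length guard suggested by the generate_stats.py
# commit: skip the backend tokenizer round-trip when a rough estimate already
# exceeds the context window. The constants and names are assumptions.
CONTEXT_SIZE = 4096           # hypothetical configured context window
CHARS_PER_TOKEN_ESTIMATE = 4  # rough average for English text


def estimated_tokens(prompt: str) -> int:
    return len(prompt) // CHARS_PER_TOKEN_ESTIMATE


def worth_tokenizing_on_backend(prompt: str) -> bool:
    # If even a cheap estimate blows past the context size, reject the
    # request locally instead of paying for an exact count on the backend.
    return estimated_tokens(prompt) <= CONTEXT_SIZE
```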