cyberes / local-llm-server (archived)
This repository has been archived on 2024-10-27. You can view files and clone it, but you cannot push or open issues or pull requests.
local-llm-server / llm_server / routes / v1 (at commit ca7044bc90)
Latest commit 18e37a72ae by Cyberes: add model selection to openai endpoint (2023-10-09 23:51:26 -06:00)
__init__.py          get streaming working, remove /v2/        2023-10-01 00:20:00 -06:00
generate.py          add model selection to openai endpoint    2023-10-09 23:51:26 -06:00
generate_stats.py    fix import error                          2023-10-04 16:29:19 -06:00
generate_stream.py   fix the queue??                           2023-10-05 21:37:18 -06:00
info.py              fix import error                          2023-10-04 16:29:19 -06:00
proxy.py             fix import error                          2023-10-04 16:29:19 -06:00