cyberes / local-llm-server
Directory: llm_server / llm / vllm (at commit 2d390e6268)
Latest commit: 77edbe779c by Cyberes, "actually validate prompt length lol" (2023-09-14 18:31:13 -06:00)
__init__.py:      implement vllm backend (2023-09-11 20:47:19 -06:00)
generate.py:      option to disable streaming, improve timeout on requests to backend, fix error handling, reduce duplicate code, misc other cleanup (2023-09-14 14:05:50 -06:00)
info.py:          actually we don't want to emulate openai (2023-09-12 01:04:11 -06:00)
vllm_backend.py:  actually validate prompt length lol (2023-09-14 18:31:13 -06:00)