This repository was archived on 2024-10-27. It can still be viewed and cloned, but it no longer accepts pushes, issues, or pull requests.
local-llm-server/llm_server/llm/vllm
Latest commit: 31ab4188f1 by Cyberes, "fix issues with queue and streaming", 2023-10-15 20:45:01 -06:00
File               Last commit message                               Date
__init__.py        port to mysql, use vllm tokenizer endpoint        2023-09-20 20:30:31 -06:00
generate.py        fix background logger, add gradio chat example    2023-10-04 19:24:47 -06:00
info.py            functional                                        2023-09-30 19:41:50 -06:00
tokenize.py        fix issues with queue and streaming               2023-10-15 20:45:01 -06:00
vllm_backend.py    fix openai confusion                              2023-10-11 12:50:20 -06:00
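
The commit messages above suggest that tokenize.py counts prompt tokens by calling a tokenizer endpoint exposed by the vLLM server rather than loading a tokenizer locally. Below is a minimal, hypothetical sketch of such a helper; the endpoint path (/tokenize), request payload, and response fields are assumptions for illustration, not details taken from this repository.

```python
# Hypothetical sketch of a tokenizer-endpoint helper (not the repo's actual code).
# The /tokenize path, JSON payload, and "count" response field are assumptions.
import requests

VLLM_BASE_URL = "http://127.0.0.1:8000"  # assumed local vLLM server address


def count_prompt_tokens(prompt: str, timeout: float = 5.0) -> int:
    """Return the token count for `prompt`, falling back to a rough
    character-based estimate if the tokenizer endpoint is unreachable."""
    try:
        # Assumed endpoint and schema; adjust to the actual server API.
        resp = requests.post(
            f"{VLLM_BASE_URL}/tokenize",
            json={"prompt": prompt},
            timeout=timeout,
        )
        resp.raise_for_status()
        return int(resp.json()["count"])
    except (requests.RequestException, KeyError, ValueError):
        # Crude fallback: roughly 4 characters per token on average.
        return max(1, len(prompt) // 4)


if __name__ == "__main__":
    print(count_prompt_tokens("Hello, world!"))
```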