This repository was archived on 2024-10-27. It remains available for viewing and cloning, but no new pushes, issues, or pull requests are accepted.
Directory: local-llm-server/llm_server/llm/vllm
Latest commit: 1646a00987 (Cyberes, 2023-09-25 12:30:40 -06:00): implement streaming on openai, improve streaming, run DB logging in background thread
Files (last commit message and date):

    __init__.py      port to mysql, use vllm tokenizer endpoint (2023-09-20 20:30:31 -06:00)
    generate.py      rewrite tokenizer, restructure validation (2023-09-24 13:02:30 -06:00)
    info.py          adjust vllm info (2023-09-21 20:13:29 -06:00)
    tokenize.py      adjust vllm info (2023-09-21 20:13:29 -06:00)
    vllm_backend.py  implement streaming on openai, improve streaming, run DB logging in background thread (2023-09-25 12:30:40 -06:00)
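The commit messages above point at two pieces of functionality handled by this directory: tokenization via a vLLM tokenizer endpoint (tokenize.py) and request logging moved to a background database thread (vllm_backend.py). The two Python sketches below only illustrate those general patterns; the endpoint path, payload shape, and helper names are assumptions for illustration and are not taken from this repository.

A minimal sketch of counting prompt tokens by calling a tokenizer endpoint over HTTP (the /tokenize route and JSON fields are assumed, not this repository's actual API):

    import requests

    def count_tokens(prompt: str, base_url: str = "http://127.0.0.1:8000") -> int:
        # Hypothetical endpoint; the real server may expose a different route or payload.
        resp = requests.post(f"{base_url}/tokenize", json={"prompt": prompt}, timeout=10)
        resp.raise_for_status()
        return len(resp.json().get("tokens", []))

A minimal sketch of handing log records to a daemon worker thread so request handling never blocks on the database (write_to_db is a hypothetical placeholder for the actual insert):

    import queue
    import threading

    log_queue: "queue.Queue[dict]" = queue.Queue()

    def write_to_db(record: dict) -> None:
        # Placeholder for the actual database insert.
        print("logged:", record)

    def _db_worker() -> None:
        while True:
            record = log_queue.get()
            try:
                write_to_db(record)
            finally:
                log_queue.task_done()

    threading.Thread(target=_db_worker, daemon=True).start()

    # Request handlers just enqueue and return immediately:
    log_queue.put({"backend": "vllm", "prompt_tokens": 123, "elapsed_ms": 456})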