Directory: local-llm-server/llm_server/llm/vllm
Latest commit: e0af2ea9c5 "convert to gunicorn" by Cyberes (2023-09-26 13:32:33 -06:00)
File             Last commit message                                                                                                          Date
__init__.py      port to mysql, use vllm tokenizer endpoint                                                                                   2023-09-20 20:30:31 -06:00
generate.py      convert to gunicorn                                                                                                          2023-09-26 13:32:33 -06:00
info.py          adjust vllm info                                                                                                             2023-09-21 20:13:29 -06:00
tokenize.py      update database, tokenizer handle null prompt, convert top_p to vllm on openai, actually validate prompt on streaming,       2023-09-25 22:32:48 -06:00
vllm_backend.py  fix background log not doing anything                                                                                        2023-09-25 18:18:29 -06:00