local-llm-server/llm_server/llm/vllm
Latest commit: d9bbcc42e6 "more work on openai endpoint" (Cyberes, 2023-09-26 22:09:11 -06:00)
__init__.py        port to mysql, use vllm tokenizer endpoint    2023-09-20 20:30:31 -06:00
generate.py        convert to gunicorn                           2023-09-26 13:32:33 -06:00
info.py            adjust vllm info                              2023-09-21 20:13:29 -06:00
tokenize.py        more work on openai endpoint                  2023-09-26 22:09:11 -06:00
vllm_backend.py    fix background log not doing anything         2023-09-25 18:18:29 -06:00