local-llm-server/llm_server/routes/openai
Latest commit: 3c1254d3bf by Cyberes, "cache stats in background" (2023-09-17 18:55:36 -06:00)
__init__.py          cache stats in background                                         2023-09-17 18:55:36 -06:00
chat_completions.py  set up queue to work with gunicorn processes, other improvements  2023-09-14 17:38:20 -06:00
info.py              show the openai system prompt                                     2023-09-13 20:25:56 -06:00
models.py            add openai-compatible backend                                     2023-09-12 16:40:09 -06:00