local-llm-server/llm_server/routes/openai
Latest commit: 0eb901cb52 (Cyberes), "don't log entire request on failure", 2023-09-26 12:32:19 -06:00
File                  Last commit message                                                                     Date
__init__.py           further align openai endpoint with expected responses                                  2023-09-24 21:45:30 -06:00
chat_completions.py   don't log entire request on failure                                                     2023-09-26 12:32:19 -06:00
info.py               further align openai endpoint with expected responses                                  2023-09-24 21:45:30 -06:00
models.py             fix homepage slowness, fix incorrect 24 hr prompters, fix redis wrapper,               2023-09-25 17:20:21 -06:00
simulated.py          implement streaming on openai, improve streaming, run DB logging in background thread  2023-09-25 12:30:40 -06:00
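The listing suggests this package exposes OpenAI-compatible routes (model listing, chat completions, plus info and simulated endpoints), with recent commits aligning response shapes with what OpenAI clients expect and avoiding logging full request bodies on failure. The sketch below is only an illustration of those response shapes, assuming a Flask blueprint; the blueprint name, route handlers, and placeholder model id are hypothetical and do not come from the repository.

import time
import uuid

from flask import Blueprint, jsonify, request

# Hypothetical blueprint; the actual module layout in llm_server/routes/openai is not shown here.
openai_bp = Blueprint('openai', __name__)


@openai_bp.route('/v1/models', methods=['GET'])
def list_models():
    # OpenAI-style model list response, the kind of payload models.py would serve.
    return jsonify({
        'object': 'list',
        'data': [
            {'id': 'local-model', 'object': 'model', 'created': int(time.time()), 'owned_by': 'local'},
        ],
    })


@openai_bp.route('/v1/chat/completions', methods=['POST'])
def chat_completions():
    body = request.get_json(silent=True)
    if not body or 'messages' not in body:
        # Return an OpenAI-style error without echoing the full request body
        # (in the spirit of the "don't log entire request on failure" commit).
        return jsonify({'error': {'message': 'messages is required',
                                  'type': 'invalid_request_error'}}), 400

    # A real server would run inference against a local backend here; this is a placeholder.
    reply = 'placeholder response'
    return jsonify({
        'id': f'chatcmpl-{uuid.uuid4().hex}',
        'object': 'chat.completion',
        'created': int(time.time()),
        'model': body.get('model', 'local-model'),
        'choices': [{
            'index': 0,
            'message': {'role': 'assistant', 'content': reply},
            'finish_reason': 'stop',
        }],
        'usage': {'prompt_tokens': 0, 'completion_tokens': 0, 'total_tokens': 0},
    })

Registering such a blueprint on a Flask app (app.register_blueprint(openai_bp)) would let standard OpenAI client libraries talk to the local server by pointing their base URL at it; streaming and request logging, which the simulated.py and chat_completions.py commits mention, are omitted from this sketch.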