local-llm-server/llm_server/routes/v1
Latest commit: a4a1d6cce6 "fix double logging" by Cyberes (2023-09-28 01:34:15 -06:00)
__init__.py         cache stats in background        2023-09-17 18:55:36 -06:00
generate.py         unify error message handling     2023-09-27 14:48:47 -06:00
generate_stats.py   clean up background threads      2023-09-27 19:39:04 -06:00
generate_stream.py  fix double logging               2023-09-28 01:34:15 -06:00
info.py             more work on openai endpoint     2023-09-26 22:09:11 -06:00
proxy.py            more work on openai endpoint     2023-09-26 22:09:11 -06:00