local-llm-server/llm_server/routes/v1

Latest commit: 43299b32ad by Cyberes, "clean up background threads" (2023-09-27 19:39:04 -06:00)
Files:
  __init__.py         cache stats in background (2023-09-17 18:55:36 -06:00)
  generate.py         unify error message handling (2023-09-27 14:48:47 -06:00)
  generate_stats.py   clean up background threads (2023-09-27 19:39:04 -06:00)
  generate_stream.py  set inference workers to daemon, add finally to inference worker, hide estimated avg tps (2023-09-27 18:36:51 -06:00)
  info.py             more work on openai endpoint (2023-09-26 22:09:11 -06:00)
  proxy.py            more work on openai endpoint (2023-09-26 22:09:11 -06:00)