local-llm-server/llm_server/routes/v1
Latest commit: Cyberes e5fbc9545d "add ratelimiting to websocket streaming endpoint, fix queue not decrementing IP requests, add console printer" (2023-09-27 21:15:54 -06:00)
File                Last commit message                                                                                                  Date
__init__.py         cache stats in background                                                                                            2023-09-17 18:55:36 -06:00
generate.py         unify error message handling                                                                                         2023-09-27 14:48:47 -06:00
generate_stats.py   clean up background threads                                                                                          2023-09-27 19:39:04 -06:00
generate_stream.py  add ratelimiting to websocket streaming endpoint, fix queue not decrementing IP requests, add console printer        2023-09-27 21:15:54 -06:00
info.py             more work on openai endpoint                                                                                         2023-09-26 22:09:11 -06:00
proxy.py            more work on openai endpoint                                                                                         2023-09-26 22:09:11 -06:00