Directory: local-llm-server/llm_server/routes/v1
Latest commit: 3c1254d3bf by Cyberes, "cache stats in background" (2023-09-17 18:55:36 -06:00)
__init__.py: cache stats in background (2023-09-17 18:55:36 -06:00)
generate.py: add moderation endpoint to openai api, update config (2023-09-14 15:07:17 -06:00)
generate_stats.py: cache stats in background (2023-09-17 18:55:36 -06:00)
generate_stream.py: option to disable streaming, improve timeout on requests to backend, fix error handling. reduce duplicate code, misc other cleanup (2023-09-14 14:05:50 -06:00)
info.py: show the openai system prompt (2023-09-13 20:25:56 -06:00)
proxy.py: fix division by 0, prettify /stats json, add js var to home (2023-09-16 17:37:43 -06:00)