local-llm-server/llm_server/routes/v1

Latest commit: 7ee2311183 "whats going on" by Cyberes, 2023-09-23 21:10:14 -06:00
File                 Last commit message                                            Date
__init__.py          cache stats in background                                      2023-09-17 18:55:36 -06:00
generate.py          add moderation endpoint to openai api, update config           2023-09-14 15:07:17 -06:00
generate_stats.py    if there's less than num concurrent wait time is 0             2023-09-23 21:09:21 -06:00
generate_stream.py   whats going on                                                 2023-09-23 21:10:14 -06:00
info.py              show the openai system prompt                                  2023-09-13 20:25:56 -06:00
proxy.py             fix division by 0, prettify /stats json, add js var to home    2023-09-16 17:37:43 -06:00
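
The generate_stats.py entry describes a small scheduling rule: when fewer requests are active than the server's concurrent-generation limit, a free slot exists, so the estimated wait time reported to clients is zero. The repo's actual implementation isn't shown here; the following is a minimal sketch of that rule, where the names (estimate_wait, active_count, concurrent_limit, queued, avg_generation_time) and the fallback round-based estimate are assumptions.

```python
def estimate_wait(active_count: int, concurrent_limit: int,
                  queued: int, avg_generation_time: float) -> float:
    """Estimate seconds until a new request begins generating.

    Rule from the 2023-09-23 generate_stats.py commit: if fewer
    requests are running than the concurrent limit, a slot is free
    and the wait is 0. The busy-case estimate below is a hypothetical
    approximation, not taken from the repo.
    """
    if active_count < concurrent_limit:
        return 0.0
    # All slots busy: each "round" of concurrent_limit finishing jobs
    # takes roughly one average generation time to drain the queue.
    rounds = (queued // concurrent_limit) + 1
    return rounds * avg_generation_time


if __name__ == "__main__":
    # Two of three slots busy: a slot is free, so no wait.
    print(estimate_wait(active_count=2, concurrent_limit=3,
                        queued=5, avg_generation_time=8.0))  # 0.0
    # All three slots busy with five queued: roughly two rounds.
    print(estimate_wait(active_count=3, concurrent_limit=3,
                        queued=5, avg_generation_time=8.0))  # 16.0
```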
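
The proxy.py entry mentions a division-by-zero fix alongside the /stats JSON changes. A common shape for that class of fix is guarding averages computed over counters that start at zero; here is a hedged sketch under that assumption, with hypothetical names (average_generation_time, total_proc_time, total_requests) rather than the repo's actual code.

```python
def average_generation_time(total_proc_time: float,
                            total_requests: int) -> float:
    """Average seconds per request, as a /stats-style metric might report.

    Before any request completes, total_requests is 0 and a bare
    division would raise ZeroDivisionError (the class of bug the
    proxy.py commit message refers to), so return 0.0 in that case.
    """
    if total_requests == 0:
        return 0.0
    return total_proc_time / total_requests
```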