local-llm-server/llm_server/routes
Latest commit 8c04238e04 by Cyberes: disable stream for now (2023-08-30 19:58:59 -06:00)
Name                 Last commit message                          Last commit date
helpers/             caching                                      2023-08-23 12:40:13 -06:00
v1/                  disable stream for now                       2023-08-30 19:58:59 -06:00
__init__.py          show total output tokens on stats            2023-08-24 20:43:11 -06:00
cache.py             add HF text-generation-inference backend     2023-08-29 13:46:41 -06:00
queue.py             add HF text-generation-inference backend     2023-08-29 13:46:41 -06:00
request_handler.py   refactor generation route                    2023-08-30 18:53:26 -06:00
stats.py             update readme                                2023-08-24 12:19:59 -06:00