This repository was archived on 2024-10-27 and is read-only: files can still be viewed and cloned, but no new pushes, issues, or pull requests are accepted.
Directory listing: local-llm-server/llm_server/routes

Latest commit: f9b9051bad by Cyberes (2023-08-29 15:46:56 -06:00): update weighted_average_column_for_model to account for when there was an error reported, insert null for response tokens when error, correctly parse x-forwarded-for, correctly convert model reported by hf-textgen
| Name | Last commit | Date |
| --- | --- | --- |
| `helpers` | caching | 2023-08-23 12:40:13 -06:00 |
| `v1` | update weighted_average_column_for_model to account for when there was an error reported, insert null for response tokens when error, correctly parse x-forwarded-for, correctly convert model reported by hf-textgen | 2023-08-29 15:46:56 -06:00 |
| `__init__.py` | show total output tokens on stats | 2023-08-24 20:43:11 -06:00 |
| `cache.py` | add HF text-generation-inference backend | 2023-08-29 13:46:41 -06:00 |
| `queue.py` | add HF text-generation-inference backend | 2023-08-29 13:46:41 -06:00 |
| `stats.py` | update readme | 2023-08-24 12:19:59 -06:00 |
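The latest commit message mentions correctly parsing `x-forwarded-for`. As a general illustration of what that usually involves (a minimal sketch, not this repository's actual code; the function name below is hypothetical), `X-Forwarded-For` is a comma-separated chain of addresses in which the left-most entry is the originating client:

```python
def client_ip_from_forwarded_for(header_value: str | None, remote_addr: str) -> str:
    """Pick the originating client IP for logging or rate limiting.

    X-Forwarded-For is a comma-separated chain ("client, proxy1, proxy2");
    the left-most entry is the original client. Fall back to the socket's
    remote address when the header is absent or empty.
    """
    if not header_value:
        return remote_addr
    first_hop = header_value.split(',')[0].strip()
    return first_hop or remote_addr


# Example: returns "203.0.113.7"
# client_ip_from_forwarded_for("203.0.113.7, 10.0.0.2", "10.0.0.2")
```

In practice the header is only trustworthy when it is set by a reverse proxy you control, so deployments behind a known proxy often use the entry appended by that proxy rather than blindly trusting the left-most value.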