This repository was archived on 2024-10-27. You can view and clone its files, but you cannot push to it or open issues or pull requests.
local-llm-server/llm_server/routes/v1
Latest commit: 08df52a4fd by Cyberes, "fix exception when not valid model" (2023-10-05 12:28:00 -06:00)
| File | Last commit message | Date |
|------|---------------------|------|
| `__init__.py` | get streaming working, remove /v2/ | 2023-10-01 00:20:00 -06:00 |
| `generate.py` | update openai endpoints | 2023-10-01 14:15:01 -06:00 |
| `generate_stats.py` | fix import error | 2023-10-04 16:29:19 -06:00 |
| `generate_stream.py` | fix exception when not valid model | 2023-10-05 12:28:00 -06:00 |
| `info.py` | fix import error | 2023-10-04 16:29:19 -06:00 |
| `proxy.py` | fix import error | 2023-10-04 16:29:19 -06:00 |
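The file names suggest a versioned HTTP API package: `generate.py` and `generate_stream.py` for blocking and streaming generation, `generate_stats.py` and `info.py` for metadata, `proxy.py` for forwarding requests to the backend, and `__init__.py` to register the routes. The listing shows no implementation, so the following is only a minimal sketch of how such a package is commonly wired up; Flask, the blueprint name, the URL prefix, and every handler shown are assumptions, not code from this repository.

```python
# Hypothetical sketch (not taken from this repository) of a versioned route
# package: a Flask blueprint that per-endpoint modules attach handlers to.
# Flask itself and all names below are assumptions.
from flask import Flask, Blueprint, jsonify

# __init__.py (hypothetical): one blueprint holding every /v1/ endpoint.
bp = Blueprint("v1", __name__, url_prefix="/api/v1")

# info.py (hypothetical): report basic backend/model information.
@bp.route("/info")
def info():
    return jsonify({"model": "example-model", "mode": "proxy"})

# generate.py (hypothetical): blocking text-generation endpoint.
@bp.route("/generate", methods=["POST"])
def generate():
    return jsonify({"results": [{"text": "..."}]})

def create_app() -> Flask:
    app = Flask(__name__)
    app.register_blueprint(bp)
    return app

if __name__ == "__main__":
    create_app().run(port=5000)
```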