Name             | Last commit                                   | Date
-----------------|-----------------------------------------------|--------------------------
cluster          | add model selection to openai endpoint        | 2023-10-09 23:51:26 -06:00
config           | fix the queue??                               | 2023-10-05 21:37:18 -06:00
database         | get streaming working again                   | 2023-10-16 16:22:52 -06:00
llm              | revert                                        | 2023-10-16 18:16:19 -06:00
pages            | update current model when we generate_stats() | 2023-08-24 21:10:00 -06:00
routes           | fix GeneratorExit                             | 2023-10-16 18:04:49 -06:00
workers          | get streaming working again                   | 2023-10-16 16:22:52 -06:00
__init__.py      | MVP                                           | 2023-08-21 21:28:52 -06:00
custom_redis.py  | get streaming working again                   | 2023-10-16 16:22:52 -06:00
helpers.py       | add length penalty param to vllm              | 2023-10-11 12:22:50 -06:00
opts.py          | fix streaming?                                | 2023-10-05 20:14:28 -06:00
pre_fork.py      | functional                                    | 2023-09-30 19:41:50 -06:00
sock.py          | get streaming working again                   | 2023-10-16 16:22:52 -06:00