| Name | Last commit | Last commit date |
|---|---|---|
| cluster | add model selection to openai endpoint | 2023-10-09 23:51:26 -06:00 |
| config | fix the queue?? | 2023-10-05 21:37:18 -06:00 |
| database | get streaming working again | 2023-10-16 16:22:52 -06:00 |
| llm | revert | 2023-10-16 18:16:19 -06:00 |
| pages | update current model when we generate_stats() | 2023-08-24 21:10:00 -06:00 |
| routes | fix streaming slowdown? | 2023-10-16 23:36:25 -06:00 |
| workers | cleanup | 2023-10-16 23:47:34 -06:00 |
| __init__.py | MVP | 2023-08-21 21:28:52 -06:00 |
| custom_redis.py | get streaming working again | 2023-10-16 16:22:52 -06:00 |
| helpers.py | add length penalty param to vllm | 2023-10-11 12:22:50 -06:00 |
| opts.py | don't pickle streaming | 2023-10-16 18:35:10 -06:00 |
| pre_fork.py | functional | 2023-09-30 19:41:50 -06:00 |
| sock.py | get streaming working again | 2023-10-16 16:22:52 -06:00 |