Cyberes | e86a5182eb | redo background processes, reorganize server.py | 2023-09-27 23:36:44 -06:00
Cyberes | 35e9847b27 | set inference workers to daemon, add finally to inference worker, hide estimated avg tps | 2023-09-27 18:36:51 -06:00
Cyberes | aba2e5b9c0 | don't use db pooling, add LLM-ST-Errors header to disable formatted errors | 2023-09-26 23:59:22 -06:00
Cyberes | 048e5a8060 | fix API key handling | 2023-09-26 22:49:53 -06:00
Cyberes | d9bbcc42e6 | more work on openai endpoint | 2023-09-26 22:09:11 -06:00
Cyberes | 1646a00987 | implement streaming on openai, improve streaming, run DB logging in background thread | 2023-09-25 12:30:40 -06:00
Cyberes | 76a1428ba0 | implement streaming for vllm | 2023-09-23 17:57:23 -06:00
Cyberes | 03e3ec5490 | port to mysql, use vllm tokenizer endpoint | 2023-09-20 20:30:31 -06:00
Cyberes | 9740df07c7 | add openai-compatible backend | 2023-09-12 16:40:09 -06:00
Cyberes | ba063f7f1b | caching | 2023-08-23 12:40:13 -06:00
Cyberes | e04d6a8a13 | minor adjustments | 2023-08-21 22:49:44 -06:00
Cyberes | 8cbf643fd3 | MVP | 2023-08-21 21:28:52 -06:00