5f4e4710c1 | Cyberes | option to prioritize by parameter count | 2023-10-04 10:19:44 -06:00
21da2f6373 | Cyberes | fix openai error message | 2023-10-01 22:58:08 -06:00
592eb08cb1 | Cyberes | add message for /v1/ | 2023-09-30 21:07:12 -06:00
114f36e709 | Cyberes | functional | 2023-09-30 19:41:50 -06:00
624ca74ce5 | Cyberes | mvp | 2023-09-29 00:09:44 -06:00
e7b57cad7b | Cyberes | set up cluster config and basic background workers | 2023-09-28 18:40:24 -06:00
e86a5182eb | Cyberes | redo background processes, reorganize server.py | 2023-09-27 23:36:44 -06:00
35e9847b27 | Cyberes | set inference workers to daemon, add finally to inference worker, hide estimated avg tps | 2023-09-27 18:36:51 -06:00
aba2e5b9c0 | Cyberes | don't use db pooling, add LLM-ST-Errors header to disable formatted errors | 2023-09-26 23:59:22 -06:00
048e5a8060 | Cyberes | fix API key handling | 2023-09-26 22:49:53 -06:00
d9bbcc42e6 | Cyberes | more work on openai endpoint | 2023-09-26 22:09:11 -06:00
1646a00987 | Cyberes | implement streaming on openai, improve streaming, run DB logging in background thread | 2023-09-25 12:30:40 -06:00
8d6b2ce49c | Cyberes | minor changes, add admin token auth system, add route to get backend info | 2023-09-24 15:54:35 -06:00
84a1fcfdd8 | Cyberes | don't store host if it's an IP | 2023-09-23 23:14:22 -06:00
76a1428ba0 | Cyberes | implement streaming for vllm | 2023-09-23 17:57:23 -06:00
03e3ec5490 | Cyberes | port to mysql, use vllm tokenizer endpoint | 2023-09-20 20:30:31 -06:00
9740df07c7 | Cyberes | add openai-compatible backend | 2023-09-12 16:40:09 -06:00
ba063f7f1b | Cyberes | caching | 2023-08-23 12:40:13 -06:00
a9b7a7a2c7 | Cyberes | display error messages in sillytavern | 2023-08-22 20:28:41 -06:00
e04d6a8a13 | Cyberes | minor adjustments | 2023-08-21 22:49:44 -06:00
8cbf643fd3 | Cyberes | MVP | 2023-08-21 21:28:52 -06:00