Cyberes | f88e2362c5 | remove some debug prints | 2023-10-03 20:01:28 -06:00
Cyberes | 62eb0196cc | t | 2023-10-03 00:13:55 -06:00
Cyberes | 0f5e22191c | test | 2023-10-03 00:12:37 -06:00
Cyberes | cd325216e2 | test | 2023-10-02 22:45:07 -06:00
Cyberes | 94141b8ecf | fix processing not being decremented on streaming, fix confusion over queue, adjust stop sequences | 2023-10-02 20:53:08 -06:00
Cyberes | 21da2f6373 | fix openai error message | 2023-10-01 22:58:08 -06:00
Cyberes | a594729d00 | fix keyerror | 2023-10-01 22:37:13 -06:00
Cyberes | 51881ae39d | fix tokenizer | 2023-10-01 17:19:34 -06:00
Cyberes | f7e9687527 | finish openai endpoints | 2023-10-01 16:04:53 -06:00
Cyberes | 2a3ff7e21e | update openai endpoints | 2023-10-01 14:15:01 -06:00
Cyberes | 624ca74ce5 | mvp | 2023-09-29 00:09:44 -06:00
Cyberes | e7b57cad7b | set up cluster config and basic background workers | 2023-09-28 18:40:24 -06:00
Cyberes | aba2e5b9c0 | don't use db pooling, add LLM-ST-Errors header to disable formatted errors | 2023-09-26 23:59:22 -06:00
Cyberes | d9bbcc42e6 | more work on openai endpoint | 2023-09-26 22:09:11 -06:00