Cyberes | e1d3fca6d3 | try to cancel inference if disconnected from client | 2023-09-28 09:55:31 -06:00
Cyberes | e42f2b6819 | fix negative queue on stats | 2023-09-28 08:47:39 -06:00
Cyberes | 347a82b7e1 | avoid sending to backend to tokenize if it's greater than our specified context size | 2023-09-28 03:54:20 -06:00
Cyberes | 59f2aac8ad | rewrite redis usage | 2023-09-28 03:44:30 -06:00
Cyberes | a4a1d6cce6 | fix double logging | 2023-09-28 01:34:15 -06:00
Cyberes | e5fbc9545d | add ratelimiting to websocket streaming endpoint, fix queue not decrementing IP requests, add console printer | 2023-09-27 21:15:54 -06:00
Cyberes | 43299b32ad | clean up background threads | 2023-09-27 19:39:04 -06:00
Cyberes | 35e9847b27 | set inference workers to daemon, add finally to inference worker, hide estimated avg tps | 2023-09-27 18:36:51 -06:00
Cyberes | 105b66d5e2 | unify error message handling | 2023-09-27 14:48:47 -06:00
Cyberes | aba2e5b9c0 | don't use db pooling, add LLM-ST-Errors header to disable formatted errors | 2023-09-26 23:59:22 -06:00
Cyberes | 048e5a8060 | fix API key handling | 2023-09-26 22:49:53 -06:00
Cyberes | d9bbcc42e6 | more work on openai endpoint | 2023-09-26 22:09:11 -06:00
Cyberes | e0af2ea9c5 | convert to gunicorn | 2023-09-26 13:32:33 -06:00
Cyberes | 0eb901cb52 | don't log entire request on failure | 2023-09-26 12:32:19 -06:00
Cyberes | 11e84db59c | update database, tokenizer handle null prompt, convert top_p to vllm on openai, actually validate prompt on streaming | 2023-09-25 22:32:48 -06:00
Cyberes | 8240a1ebbb | fix background log not doing anything | 2023-09-25 18:18:29 -06:00
Cyberes | 8184e24bff | fix sending error messages when streaming | 2023-09-25 17:37:58 -06:00
Cyberes | 7ce60079d7 | fix typo | 2023-09-25 17:24:51 -06:00
Cyberes | 30282479a0 | fix flask exception | 2023-09-25 17:22:28 -06:00
Cyberes | 135bd743bb | fix homepage slowness, fix incorrect 24 hr prompters, fix redis wrapper | 2023-09-25 17:20:21 -06:00
Cyberes | 52e6965b5e | don't count SYSTEM tokens for recent prompters, fix sql exclude for SYSTEM tokens | 2023-09-25 13:00:39 -06:00
Cyberes | 3eaabc8c35 | fix copied code | 2023-09-25 12:38:02 -06:00
Cyberes | 1646a00987 | implement streaming on openai, improve streaming, run DB logging in background thread | 2023-09-25 12:30:40 -06:00
Cyberes | 6459a1c91b | allow setting simultaneous IP limit per-token, fix token use tracker, fix tokens on streaming | 2023-09-25 00:55:20 -06:00
Cyberes | 320f51e01c | further align openai endpoint with expected responses | 2023-09-24 21:45:30 -06:00
Cyberes | 8d6b2ce49c | minor changes, add admin token auth system, add route to get backend info | 2023-09-24 15:54:35 -06:00
Cyberes | 2678102153 | handle error while streaming | 2023-09-24 13:27:27 -06:00
Cyberes | 0015e653b2 | adjust a few final things | 2023-09-23 22:30:59 -06:00
Cyberes | fab7b7ccdd | active gen workers wait | 2023-09-23 21:17:13 -06:00
Cyberes | 7ee2311183 | what's going on | 2023-09-23 21:10:14 -06:00
Cyberes | 94e845cd1a | if there's less than num concurrent, wait time is 0 | 2023-09-23 21:09:21 -06:00
Cyberes | 41e622d19c | fix two exceptions | 2023-09-23 20:55:49 -06:00
Cyberes | f67ac8175b | fix wrong approach for streaming | 2023-09-23 18:44:07 -06:00
Cyberes | 8a4de7df44 | oops | 2023-09-23 18:01:12 -06:00
Cyberes | 76a1428ba0 | implement streaming for vllm | 2023-09-23 17:57:23 -06:00
Cyberes | f9a80f3028 | change proompters 1 min to 5 min | 2023-09-20 21:21:22 -06:00
Cyberes | 03e3ec5490 | port to mysql, use vllm tokenizer endpoint | 2023-09-20 20:30:31 -06:00
Cyberes | 2d390e6268 | *blushes* oopsie daisy | 2023-09-17 20:22:17 -06:00
Cyberes | eb3179cfff | fix recent proompters to work with gunicorn | 2023-09-17 19:06:53 -06:00
Cyberes | 3c1254d3bf | cache stats in background | 2023-09-17 18:55:36 -06:00
Cyberes | edf13db324 | calculate estimated wait time better | 2023-09-17 18:33:57 -06:00
Cyberes | 354ad8192d | fix division by 0, prettify /stats json, add js var to home | 2023-09-16 17:37:43 -06:00
Cyberes | a89295193f | add moderation endpoint to openai api, update config | 2023-09-14 15:07:17 -06:00
Cyberes | 8f4f17166e | adjust | 2023-09-14 14:36:22 -06:00
Cyberes | 93a344f4c5 | check if the backend crapped out, print some more stuff | 2023-09-14 14:26:25 -06:00
Cyberes | 79b1e01b61 | option to disable streaming, improve timeout on requests to backend, fix error handling, reduce duplicate code, misc other cleanup | 2023-09-14 14:05:50 -06:00
Cyberes | e79b206e1a | rename average_tps to estimated_avg_tps | 2023-09-14 01:35:25 -06:00
Cyberes | 12e894032e | show the openai system prompt | 2023-09-13 20:25:56 -06:00
Cyberes | 9740df07c7 | add openai-compatible backend | 2023-09-12 16:40:09 -06:00
Cyberes | 1d9f40765e | remove text-generation-inference backend | 2023-09-12 13:09:47 -06:00