Commit log — author: Cyberes (all timestamps -06:00)

563630547a  2023-10-23 17:32:33  add robots.txt
0771c2325c  2023-10-23 17:24:20  fix inference workers quitting when a backend is offline, start adding logging, improve tokenizer error handling
3cf73fec9b  2023-10-23 15:22:57  fix a few exceptions when all backends go offline
d43f110a14  2023-10-22 12:19:20  fix redis cycle and add no reset to daemon
e236e93a79  2023-10-20 18:26:41  clean up a bit
f39e976b34  2023-10-20 17:41:47  dameon printer: Calculate the queue size the same way it's done on the stats
763139c949  2023-10-20 14:02:30  fix keyerror
e838f591aa  2023-10-20 14:00:24  fix keyerror?
0abd4b94fb  2023-10-20 13:57:20  track down keyerror
4f5b2dbecb  2023-10-20 12:14:34  add tests
6e74ce7c28  2023-10-19 17:59:27  fix old code in completions
b9566e9db7  2023-10-18 09:23:54  docs and stuff
56a2ca464b  2023-10-18 09:12:30  change print
50377eca22  2023-10-18 09:09:22  track lag on get_ip_request_count()
92e4ecd8a1  2023-10-18 09:03:10  refer to queue for tracking IP count rather than seperate value
be03569165  2023-10-17 12:42:48  use backend handler to build parameters when sending test prompt
90adffaec8  2023-10-17 12:32:41  test
4c2c164ce1  2023-10-17 12:29:12  test
2fed87d340  2023-10-17 11:46:39  remove timed-out items from queue
7998cfca87  2023-10-16 23:47:34  cleanup
2ed0e01db6  2023-10-16 23:44:11  background thread
6f65791795  2023-10-16 23:40:07  adjust
9e3cbc9d2e  2023-10-16 23:36:25  fix streaming slowdown?
c3c053e071  2023-10-16 23:29:17  test
806e522d16  2023-10-16 18:35:10  don't pickle streaming
81baf9616f  2023-10-16 18:16:19  revert
21755450a3  2023-10-16 18:10:21  test
1e68e10b62  2023-10-16 18:04:49  fix GeneratorExit
20047fa0e4  2023-10-16 18:01:17  2000 chunk size
19a193b792  2023-10-16 17:59:21  increase tokenization chunk size
2c7773cc4f  2023-10-16 16:22:52  get streaming working again
151b3e4769  2023-10-16 00:18:05  begin streaming rewrite
24aab3cd93  2023-10-15 20:59:11  fix streaming disabled
381bdb950f  2023-10-15 20:46:32  remove debug print
31ab4188f1  2023-10-15 20:45:01  fix issues with queue and streaming
3ec9b2347f  2023-10-15 17:24:18  fix wrong datatype
b3f0c4b28f  2023-10-15 15:14:32  remove debug print
83f3ba8919  2023-10-15 15:11:37  trying to fix workers still processing after backend goes offline
4e3985e156  2023-10-11 18:17:02  fix wrong status code on openai streaming
74cf8f309b  2023-10-11 18:04:15  clean up
69b8c1e35c  2023-10-11 12:50:20  fix openai confusion
1d1c45dc1a  2023-10-11 12:22:50  add length penalty param to vllm
78114771b0  2023-10-11 09:20:00  fix oai exception
f4e5b5275d  2023-10-11 09:09:41  test
18e37a72ae  2023-10-09 23:51:26  add model selection to openai endpoint
5f7bf4faca  2023-10-09 18:12:12  misc changes
ae4d4e5ca9  2023-10-09 10:31:35  fix exception
467e1893ea  2023-10-08 19:36:12  fix issue with null data on openai
3e5feb9c97  2023-10-05 21:43:49  fix stat
e8964fcfd2  2023-10-05 21:37:18  fix the queue??