Cyberes | be03569165 | use backend handler to build parameters when sending test prompt | 2023-10-17 12:42:48 -06:00
Cyberes | 18e37a72ae | add model selection to openai endpoint | 2023-10-09 23:51:26 -06:00
Cyberes | 9b819573e8 | fix import error | 2023-10-05 18:39:31 -06:00
Cyberes | e07e31df0a | fix | 2023-10-05 18:37:50 -06:00
Cyberes | 1670594908 | fix import error | 2023-10-04 16:29:19 -06:00
Cyberes | 5f4e4710c1 | option to prioritize by parameter count | 2023-10-04 10:19:44 -06:00
Cyberes | 67f5df9bb9 | fix stats page | 2023-10-03 20:42:53 -06:00
Cyberes | 32ad97e57c | do default model rather than default backend, adjust moderation endpoint logic and add timeout, exclude system tokens from recent proompters, calculate number of moderators from endpoint concurrent gens, adjust homepage | 2023-10-03 13:40:08 -06:00
Cyberes | 94141b8ecf | fix processing not being decremented on streaming, fix confusion over queue, adjust stop sequences | 2023-10-02 20:53:08 -06:00
Cyberes | f7e9687527 | finish openai endpoints | 2023-10-01 16:04:53 -06:00
Cyberes | 2a3ff7e21e | update openai endpoints | 2023-10-01 14:15:01 -06:00
Cyberes | d203973e80 | fix routes | 2023-10-01 01:13:13 -06:00
Cyberes | 25ec56a5ef | get streaming working, remove /v2/ | 2023-10-01 00:20:00 -06:00
Cyberes | bc25d92c95 | reduce tokens for backend tester | 2023-09-30 21:48:16 -06:00
Cyberes | 1151bb5475 | adjust stats | 2023-09-30 20:42:48 -06:00
Cyberes | 114f36e709 | functional | 2023-09-30 19:41:50 -06:00
Cyberes | 624ca74ce5 | mvp | 2023-09-29 00:09:44 -06:00
Cyberes | e7b57cad7b | set up cluster config and basic background workers | 2023-09-28 18:40:24 -06:00