Cyberes | 77db34a6a7 | g | 2023-10-04 12:59:19 -06:00
Cyberes | 364b795268 | fix | 2023-10-04 12:57:11 -06:00
Cyberes | 7cb624c5f5 | f | 2023-10-04 12:47:59 -06:00
Cyberes | 6af5365015 | c | 2023-10-04 12:45:20 -06:00
Cyberes | f3a13fcda8 | c | 2023-10-04 12:44:33 -06:00
Cyberes | a15b5465df | c | 2023-10-04 12:44:09 -06:00
Cyberes | 95d781725e | t | 2023-10-04 12:42:18 -06:00
Cyberes | 1b21cb69c1 | test | 2023-10-04 12:40:29 -06:00
Cyberes | 4deb32bf1c | test | 2023-10-04 10:32:11 -06:00
Cyberes | 7e3af3599d | test | 2023-10-04 10:29:58 -06:00
Cyberes | 4634e36eeb | text | 2023-10-04 10:26:39 -06:00
Cyberes | b76e77a66a | fix exception | 2023-10-04 10:24:28 -06:00
Cyberes | 5f4e4710c1 | option to prioritize by parameter count | 2023-10-04 10:19:44 -06:00
Cyberes | 6dc3529190 | show online status on stats page | 2023-10-03 23:39:25 -06:00
Cyberes | 1a7f22ec55 | adjust again | 2023-10-03 20:47:37 -06:00
Cyberes | 67f5df9bb9 | fix stats page | 2023-10-03 20:42:53 -06:00
Cyberes | 33b4b8404b | clean up streaming | 2023-10-03 14:10:50 -06:00
Cyberes | e16f415749 | fix | 2023-10-03 13:49:00 -06:00
Cyberes | 581a0fec99 | fix exception | 2023-10-03 13:47:18 -06:00
Cyberes | 32ad97e57c | do default model rather than default backend, adjust moderation endpoint logic and add timeout, exclude system tokens from recent proompters, calculate number of moderators from endpoint concurrent gens, adjust homepage | 2023-10-03 13:40:08 -06:00
Cyberes | 63c12ea830 | fix | 2023-10-03 01:25:43 -06:00
Cyberes | f6acd67738 | t | 2023-10-03 00:05:32 -06:00
Cyberes | 07d6f6d8e9 | test | 2023-10-03 00:03:39 -06:00
Cyberes | cd325216e2 | test | 2023-10-02 22:45:07 -06:00
Cyberes | 94141b8ecf | fix processing not being decremented on streaming, fix confusion over queue, adjust stop sequences | 2023-10-02 20:53:08 -06:00
Cyberes | b0089859d7 | fix ratelimiting | 2023-10-02 02:05:15 -06:00
Cyberes | d1c4e68f8b | fix openai models response | 2023-10-01 23:07:49 -06:00
Cyberes | 21da2f6373 | fix openai error message | 2023-10-01 22:58:08 -06:00
Cyberes | f7e9687527 | finish openai endpoints | 2023-10-01 16:04:53 -06:00
Cyberes | 2a3ff7e21e | update openai endpoints | 2023-10-01 14:15:01 -06:00
Cyberes | 93d19fb95b | fix exception | 2023-10-01 10:25:32 -06:00
Cyberes | d203973e80 | fix routes | 2023-10-01 01:13:13 -06:00
Cyberes | 25ec56a5ef | get streaming working, remove /v2/ | 2023-10-01 00:20:00 -06:00
Cyberes | b10d22ca0d | cache the home page in the background | 2023-09-30 23:03:42 -06:00
Cyberes | 9235725bdd | adjust message | 2023-09-30 21:35:55 -06:00
Cyberes | 61856b4383 | adjust message | 2023-09-30 21:34:32 -06:00
Cyberes | 7af3dbd76b | add message about settings | 2023-09-30 21:31:25 -06:00
Cyberes | 592eb08cb1 | add message for /v1/ | 2023-09-30 21:07:12 -06:00
Cyberes | 166b2316e8 | depricate v1 | 2023-09-30 20:59:24 -06:00
Cyberes | 1151bb5475 | adjust stats | 2023-09-30 20:42:48 -06:00
Cyberes | e0f86d053a | reorganize to api v2 | 2023-09-30 19:42:41 -06:00
Cyberes | 114f36e709 | functional | 2023-09-30 19:41:50 -06:00
Cyberes | 624ca74ce5 | mvp | 2023-09-29 00:09:44 -06:00
Cyberes | e7b57cad7b | set up cluster config and basic background workers | 2023-09-28 18:40:24 -06:00
Cyberes | e1d3fca6d3 | try to cancel inference if disconnected from client | 2023-09-28 09:55:31 -06:00
Cyberes | e42f2b6819 | fix negative queue on stats | 2023-09-28 08:47:39 -06:00
Cyberes | 347a82b7e1 | avoid sending to backend to tokenize if it's greater than our specified context size | 2023-09-28 03:54:20 -06:00
Cyberes | 59f2aac8ad | rewrite redis usage | 2023-09-28 03:44:30 -06:00
Cyberes | a4a1d6cce6 | fix double logging | 2023-09-28 01:34:15 -06:00
Cyberes | ecdf819088 | fix try/finally with continue, fix wrong subclass signature | 2023-09-28 00:11:34 -06:00