Commit Graph

223 Commits

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Cyberes | 4deb32bf1c | test | 2023-10-04 10:32:11 -06:00 |
| Cyberes | 7e3af3599d | test | 2023-10-04 10:29:58 -06:00 |
| Cyberes | 4634e36eeb | text | 2023-10-04 10:26:39 -06:00 |
| Cyberes | b76e77a66a | fix exception | 2023-10-04 10:24:28 -06:00 |
| Cyberes | 5f4e4710c1 | option to prioritize by parameter count | 2023-10-04 10:19:44 -06:00 |
| Cyberes | 6dc3529190 | show online status on stats page | 2023-10-03 23:39:25 -06:00 |
| Cyberes | 1a7f22ec55 | adjust again | 2023-10-03 20:47:37 -06:00 |
| Cyberes | 67f5df9bb9 | fix stats page | 2023-10-03 20:42:53 -06:00 |
| Cyberes | f88e2362c5 | remove some debug prints | 2023-10-03 20:01:28 -06:00 |
| Cyberes | 33b4b8404b | clean up streaming | 2023-10-03 14:10:50 -06:00 |
| Cyberes | e16f415749 | fix | 2023-10-03 13:49:00 -06:00 |
| Cyberes | 581a0fec99 | fix exception | 2023-10-03 13:47:18 -06:00 |
| Cyberes | 32ad97e57c | do default model rather than default backend, adjust moderation endpoint logic and add timeout, exclude system tokens from recent proompters, calculate number of moderators from endpoint concurrent gens, adjust homepage | 2023-10-03 13:40:08 -06:00 |
| Cyberes | 63c12ea830 | fix | 2023-10-03 01:25:43 -06:00 |
| Cyberes | ca1baa4870 | test | 2023-10-03 00:15:16 -06:00 |
| Cyberes | 62eb0196cc | t | 2023-10-03 00:13:55 -06:00 |
| Cyberes | 0f5e22191c | test | 2023-10-03 00:12:37 -06:00 |
| Cyberes | 70126acdf2 | test | 2023-10-03 00:12:13 -06:00 |
| Cyberes | f6acd67738 | t | 2023-10-03 00:05:32 -06:00 |
| Cyberes | 07d6f6d8e9 | test | 2023-10-03 00:03:39 -06:00 |
| Cyberes | cd325216e2 | test | 2023-10-02 22:45:07 -06:00 |
| Cyberes | aed5db4968 | trying to narrow down error | 2023-10-02 21:43:36 -06:00 |
| Cyberes | 94141b8ecf | fix processing not being decremented on streaming, fix confusion over queue, adjust stop sequences | 2023-10-02 20:53:08 -06:00 |
| Cyberes | 4f226ae38e | handle requests to offline backends | 2023-10-02 11:11:48 -06:00 |
| Cyberes | b0089859d7 | fix ratelimiting | 2023-10-02 02:05:15 -06:00 |
| Cyberes | d1c4e68f8b | fix openai models response | 2023-10-01 23:07:49 -06:00 |
| Cyberes | 21da2f6373 | fix openai error message | 2023-10-01 22:58:08 -06:00 |
| Cyberes | a594729d00 | fix keyerror | 2023-10-01 22:37:13 -06:00 |
| Cyberes | 51881ae39d | fix tokenizer | 2023-10-01 17:19:34 -06:00 |
| Cyberes | f7e9687527 | finish openai endpoints | 2023-10-01 16:04:53 -06:00 |
| Cyberes | 2a3ff7e21e | update openai endpoints | 2023-10-01 14:15:01 -06:00 |
| Cyberes | 93d19fb95b | fix exception | 2023-10-01 10:25:32 -06:00 |
| Cyberes | d203973e80 | fix routes | 2023-10-01 01:13:13 -06:00 |
| Cyberes | 25ec56a5ef | get streaming working, remove /v2/ | 2023-10-01 00:20:00 -06:00 |
| Cyberes | b10d22ca0d | cache the home page in the background | 2023-09-30 23:03:42 -06:00 |
| Cyberes | bc25d92c95 | reduce tokens for backend tester | 2023-09-30 21:48:16 -06:00 |
| Cyberes | 9235725bdd | adjust message | 2023-09-30 21:35:55 -06:00 |
| Cyberes | 61856b4383 | adjust message | 2023-09-30 21:34:32 -06:00 |
| Cyberes | 7af3dbd76b | add message about settings | 2023-09-30 21:31:25 -06:00 |
| Cyberes | 592eb08cb1 | add message for /v1/ | 2023-09-30 21:07:12 -06:00 |
| Cyberes | 166b2316e8 | depricate v1 | 2023-09-30 20:59:24 -06:00 |
| Cyberes | 1151bb5475 | adjust stats | 2023-09-30 20:42:48 -06:00 |
| Cyberes | e0f86d053a | reorganize to api v2 | 2023-09-30 19:42:41 -06:00 |
| Cyberes | 114f36e709 | functional | 2023-09-30 19:41:50 -06:00 |
| Cyberes | 624ca74ce5 | mvp | 2023-09-29 00:09:44 -06:00 |
| Cyberes | e7b57cad7b | set up cluster config and basic background workers | 2023-09-28 18:40:24 -06:00 |
| Cyberes | e1d3fca6d3 | try to cancel inference if disconnected from client | 2023-09-28 09:55:31 -06:00 |
| Cyberes | e42f2b6819 | fix negative queue on stats | 2023-09-28 08:47:39 -06:00 |
| Cyberes | 347a82b7e1 | avoid sending to backend to tokenize if it's greater than our specified context size | 2023-09-28 03:54:20 -06:00 |
| Cyberes | 467b804ad7 | raise printer interval | 2023-09-28 03:47:27 -06:00 |