Commit Graph

209 Commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Cyberes | ca1baa4870 | test | 2023-10-03 00:15:16 -06:00 |
| Cyberes | 62eb0196cc | t | 2023-10-03 00:13:55 -06:00 |
| Cyberes | 0f5e22191c | test | 2023-10-03 00:12:37 -06:00 |
| Cyberes | 70126acdf2 | test | 2023-10-03 00:12:13 -06:00 |
| Cyberes | f6acd67738 | t | 2023-10-03 00:05:32 -06:00 |
| Cyberes | 07d6f6d8e9 | test | 2023-10-03 00:03:39 -06:00 |
| Cyberes | cd325216e2 | test | 2023-10-02 22:45:07 -06:00 |
| Cyberes | aed5db4968 | trying to narrow down error | 2023-10-02 21:43:36 -06:00 |
| Cyberes | 94141b8ecf | fix processing not being decremented on streaming, fix confusion over queue, adjust stop sequences | 2023-10-02 20:53:08 -06:00 |
| Cyberes | 4f226ae38e | handle requests to offline backends | 2023-10-02 11:11:48 -06:00 |
| Cyberes | b0089859d7 | fix ratelimiting | 2023-10-02 02:05:15 -06:00 |
| Cyberes | d1c4e68f8b | fix openai models response | 2023-10-01 23:07:49 -06:00 |
| Cyberes | 21da2f6373 | fix openai error message | 2023-10-01 22:58:08 -06:00 |
| Cyberes | a594729d00 | fix keyerror | 2023-10-01 22:37:13 -06:00 |
| Cyberes | 51881ae39d | fix tokenizer | 2023-10-01 17:19:34 -06:00 |
| Cyberes | f7e9687527 | finish openai endpoints | 2023-10-01 16:04:53 -06:00 |
| Cyberes | 2a3ff7e21e | update openai endpoints | 2023-10-01 14:15:01 -06:00 |
| Cyberes | 93d19fb95b | fix exception | 2023-10-01 10:25:32 -06:00 |
| Cyberes | d203973e80 | fix routes | 2023-10-01 01:13:13 -06:00 |
| Cyberes | 25ec56a5ef | get streaming working, remove /v2/ | 2023-10-01 00:20:00 -06:00 |
| Cyberes | b10d22ca0d | cache the home page in the background | 2023-09-30 23:03:42 -06:00 |
| Cyberes | bc25d92c95 | reduce tokens for backend tester | 2023-09-30 21:48:16 -06:00 |
| Cyberes | 9235725bdd | adjust message | 2023-09-30 21:35:55 -06:00 |
| Cyberes | 61856b4383 | adjust message | 2023-09-30 21:34:32 -06:00 |
| Cyberes | 7af3dbd76b | add message about settings | 2023-09-30 21:31:25 -06:00 |
| Cyberes | 592eb08cb1 | add message for /v1/ | 2023-09-30 21:07:12 -06:00 |
| Cyberes | 166b2316e8 | depricate v1 | 2023-09-30 20:59:24 -06:00 |
| Cyberes | 1151bb5475 | adjust stats | 2023-09-30 20:42:48 -06:00 |
| Cyberes | e0f86d053a | reorganize to api v2 | 2023-09-30 19:42:41 -06:00 |
| Cyberes | 114f36e709 | functional | 2023-09-30 19:41:50 -06:00 |
| Cyberes | 624ca74ce5 | mvp | 2023-09-29 00:09:44 -06:00 |
| Cyberes | e7b57cad7b | set up cluster config and basic background workers | 2023-09-28 18:40:24 -06:00 |
| Cyberes | e1d3fca6d3 | try to cancel inference if disconnected from client | 2023-09-28 09:55:31 -06:00 |
| Cyberes | e42f2b6819 | fix negative queue on stats | 2023-09-28 08:47:39 -06:00 |
| Cyberes | 347a82b7e1 | avoid sending to backend to tokenize if it's greater than our specified context size | 2023-09-28 03:54:20 -06:00 |
| Cyberes | 467b804ad7 | raise printer interval | 2023-09-28 03:47:27 -06:00 |
| Cyberes | 315d42bbc5 | divide by 0??? | 2023-09-28 03:46:01 -06:00 |
| Cyberes | 59f2aac8ad | rewrite redis usage | 2023-09-28 03:44:30 -06:00 |
| Cyberes | a4a1d6cce6 | fix double logging | 2023-09-28 01:34:15 -06:00 |
| Cyberes | ecdf819088 | fix try/finally with continue, fix wrong subclass signature | 2023-09-28 00:11:34 -06:00 |
| Cyberes | e86a5182eb | redo background processes, reorganize server.py | 2023-09-27 23:36:44 -06:00 |
| Cyberes | 097d614a35 | fix duplicate logging from console printer thread | 2023-09-27 21:28:25 -06:00 |
| Cyberes | e5fbc9545d | add ratelimiting to websocket streaming endpoint, fix queue not decrementing IP requests, add console printer | 2023-09-27 21:15:54 -06:00 |
| Cyberes | 43299b32ad | clean up background threads | 2023-09-27 19:39:04 -06:00 |
| Cyberes | 35e9847b27 | set inference workers to daemon, add finally to inference worker, hide estimated avg tps | 2023-09-27 18:36:51 -06:00 |
| Cyberes | 105b66d5e2 | unify error message handling | 2023-09-27 14:48:47 -06:00 |
| Cyberes | 957a6cd092 | fix error handling | 2023-09-27 14:36:49 -06:00 |
| Cyberes | aba2e5b9c0 | don't use db pooling, add LLM-ST-Errors header to disable formatted errors | 2023-09-26 23:59:22 -06:00 |
| Cyberes | 048e5a8060 | fix API key handling | 2023-09-26 22:49:53 -06:00 |
| Cyberes | d9bbcc42e6 | more work on openai endpoint | 2023-09-26 22:09:11 -06:00 |