Commit Graph

35 Commits

Author SHA1 Message Date
Cyberes 9a1d41a9b7 get functional again 2024-07-07 15:05:35 -06:00
Cyberes 20366fbd08 misc adjustments 2024-05-07 22:56:36 -06:00
Cyberes fe23a2282f refactor, add Llm-Disable-Openai header 2024-05-07 17:41:53 -06:00
Cyberes 5bd1044fad openai error message cleanup 2024-05-07 17:07:34 -06:00
Cyberes fd09c783d3 refactor a lot of things, major cleanup, use postgresql 2024-05-07 17:03:41 -06:00
Cyberes ee9a0d4858 redo config 2024-05-07 12:20:53 -06:00
Cyberes ff82add09e redo database connection, add pooling, minor logging changes, other clean up 2024-05-07 09:48:51 -06:00
Cyberes 0059e7956c Merge cluster to master (#3) 2023-10-27 19:19:22 -06:00
    Co-authored-by: Cyberes <cyberes@evulid.cc>
    Reviewed-on: #3
Cyberes a4a1d6cce6 fix double logging 2023-09-28 01:34:15 -06:00
Cyberes ecdf819088 fix try/finally with continue, fix wrong subclass signature 2023-09-28 00:11:34 -06:00
Cyberes e86a5182eb redo background processes, reorganize server.py 2023-09-27 23:36:44 -06:00
Cyberes 105b66d5e2 unify error message handling 2023-09-27 14:48:47 -06:00
Cyberes 957a6cd092 fix error handling 2023-09-27 14:36:49 -06:00
Cyberes aba2e5b9c0 don't use db pooling, add LLM-ST-Errors header to disable formatted errors 2023-09-26 23:59:22 -06:00
Cyberes d9bbcc42e6 more work on openai endpoint 2023-09-26 22:09:11 -06:00
Cyberes e0af2ea9c5 convert to gunicorn 2023-09-26 13:32:33 -06:00
Cyberes bbdb9c9d55 try to prevent "### XXX" responses on openai 2023-09-25 23:14:35 -06:00
Cyberes 11e84db59c update database, tokenizer handle null prompt, convert top_p to vllm on openai, actually validate prompt on streaming, 2023-09-25 22:32:48 -06:00
Cyberes 2d299dbae5 openai_force_no_hashes 2023-09-25 22:01:57 -06:00
Cyberes 135bd743bb fix homepage slowness, fix incorrect 24 hr prompters, fix redis wrapper, 2023-09-25 17:20:21 -06:00
Cyberes 1646a00987 implement streaming on openai, improve streaming, run DB logging in background thread 2023-09-25 12:30:40 -06:00
Cyberes bbe5d5a8fe improve openai endpoint, exclude system tokens more places 2023-09-25 09:32:23 -06:00
Cyberes 320f51e01c further align openai endpoint with expected responses 2023-09-24 21:45:30 -06:00
Cyberes cb99c3490e rewrite tokenizer, restructure validation 2023-09-24 13:02:30 -06:00
Cyberes 03e3ec5490 port to mysql, use vllm tokenizer endpoint 2023-09-20 20:30:31 -06:00
Cyberes edf13db324 calculate estimated wait time better 2023-09-17 18:33:57 -06:00
Cyberes 7434ae1b5b openai: improve moderation checking 2023-09-17 17:40:05 -06:00
Cyberes 3100b0a924 set up queue to work with gunicorn processes, other improvements 2023-09-14 17:38:20 -06:00
Cyberes 5d03f875cb adjust prompt 2023-09-14 15:43:04 -06:00
Cyberes 1cf4c95ba2 ah, oops 2023-09-14 15:14:59 -06:00
Cyberes a89295193f add moderation endpoint to openai api, update config 2023-09-14 15:07:17 -06:00
Cyberes 79b1e01b61 option to disable streaming, improve timeout on requests to backend, fix error handling. reduce duplicate code, misc other cleanup 2023-09-14 14:05:50 -06:00
Cyberes 05a45e6ac6 didnt test anything 2023-09-13 11:51:46 -06:00
Cyberes bcedd2ab3d adjust logging, add more vllm stuff 2023-09-13 11:22:33 -06:00
Cyberes 9740df07c7 add openai-compatible backend 2023-09-12 16:40:09 -06:00