Commit Graph

284 Commits

Author SHA1 Message Date
Cyberes 23cb4df7db Update 'other/vllm/Docker/Dockerfile' 2023-10-01 02:08:16 -06:00
Cyberes ac43cb0f5a Update 'other/vllm/Docker/README.md' 2023-09-30 22:41:11 -06:00
Cyberes fdbc3feae4 Update 'other/vllm/Docker/start-container.sh' 2023-09-30 22:19:04 -06:00
Cyberes 3e651e64d2 Update 'other/vllm/Docker/start-container.sh' 2023-09-30 19:24:20 -06:00
Cyberes 428123e9f1 Update 'other/vllm/Docker/start-container.sh' 2023-09-30 19:11:26 -06:00
Cyberes ee83608a52 Update 'other/vllm/Docker/start-container.sh' 2023-09-30 19:10:10 -06:00
Cyberes 79fef7bc5a Update 'other/vllm/Docker/Dockerfile' 2023-09-30 19:04:31 -06:00
Cyberes c65b722211 Update 'other/vllm/Docker/DOCKER.md' 2023-09-30 13:32:46 -06:00
Cyberes d3f529ca8b Upload files to 'other/vllm/Docker' 2023-09-30 13:25:10 -06:00
Cyberes c047df0dc0 Update 'other/vllm/Docker/start-container.sh' 2023-09-30 13:24:47 -06:00
Cyberes d8ac9dc042 Update 'other/vllm/Docker/supervisord.conf' 2023-09-30 13:00:01 -06:00
Cyberes 2e998344d6 Update 'other/vllm/vllm_api_server.py' 2023-09-29 22:36:03 -06:00
Cyberes c888f5c789 update docker 2023-09-29 22:28:38 -06:00
Cyberes 89e9f42663 remove secrets from dockerfile, use /storage instead 2023-09-28 17:02:45 -06:00
Cyberes e1d3fca6d3 try to cancel inference if disconnected from client 2023-09-28 09:55:31 -06:00
Cyberes e42f2b6819 fix negative queue on stats 2023-09-28 08:47:39 -06:00
Cyberes 347a82b7e1 avoid sending to backend to tokenize if it's greater than our specified context size 2023-09-28 03:54:20 -06:00
Cyberes 467b804ad7 raise printer interval 2023-09-28 03:47:27 -06:00
Cyberes 315d42bbc5 divide by 0??? 2023-09-28 03:46:01 -06:00
Cyberes 59f2aac8ad rewrite redis usage 2023-09-28 03:44:30 -06:00
Cyberes a4a1d6cce6 fix double logging 2023-09-28 01:34:15 -06:00
Cyberes ecdf819088 fix try/finally with continue, fix wrong subclass signature 2023-09-28 00:11:34 -06:00
Cyberes 3a538d649a fix docker typo lol 2023-09-28 00:02:41 -06:00
Cyberes e86a5182eb redo background processes, reorganize server.py 2023-09-27 23:36:44 -06:00
Cyberes 097d614a35 fix duplicate logging from console printer thread 2023-09-27 21:28:25 -06:00
Cyberes adc0905c6f fix imports 2023-09-27 21:20:08 -06:00
Cyberes e5fbc9545d add ratelimiting to websocket streaming endpoint, fix queue not decrementing IP requests, add console printer 2023-09-27 21:15:54 -06:00
Cyberes 43299b32ad clean up background threads 2023-09-27 19:39:04 -06:00
Cyberes 35e9847b27 set inference workers to daemon, add finally to inference worker, hide estimated avg tps 2023-09-27 18:36:51 -06:00
Cyberes abef9eba7d adjust dockerfile paths 2023-09-27 17:50:55 -06:00
Cyberes 1874e6f7c4 fix docker logging 2023-09-27 17:00:46 -06:00
Cyberes ffb7af8f3c rename docker service 2023-09-27 16:58:49 -06:00
Cyberes 3b0ec723a5 fix docker 2023-09-27 16:57:14 -06:00
Cyberes 74f16afa67 update dockerfile 2023-09-27 16:12:36 -06:00
Cyberes eade509947 improve dockerfile 2023-09-27 14:59:33 -06:00
Cyberes 105b66d5e2 unify error message handling 2023-09-27 14:48:47 -06:00
Cyberes 957a6cd092 fix error handling 2023-09-27 14:36:49 -06:00
Cyberes 90bb68115f adjust docker 2023-09-27 00:04:37 -06:00
Cyberes aba2e5b9c0 don't use db pooling, add LLM-ST-Errors header to disable formatted errors 2023-09-26 23:59:22 -06:00
Cyberes 7456bbe085 adjust docker 2023-09-26 23:23:54 -06:00
Cyberes 048e5a8060 fix API key handling 2023-09-26 22:49:53 -06:00
Cyberes d9bbcc42e6 more work on openai endpoint 2023-09-26 22:09:11 -06:00
Cyberes 9e6624e779 modify dockerfile for paperspace 2023-09-26 21:45:13 -06:00
Cyberes e3c57d874a add vllm dockerfile 2023-09-26 14:48:34 -06:00
Cyberes e0af2ea9c5 convert to gunicorn 2023-09-26 13:32:33 -06:00
Cyberes 0eb901cb52 don't log entire request on failure 2023-09-26 12:32:19 -06:00
Cyberes b44dda7a3a option to show SYSTEM tokens in stats 2023-09-25 23:39:50 -06:00
Cyberes e37cde5d48 exclude system token more places 2023-09-25 23:22:16 -06:00
Cyberes bbdb9c9d55 try to prevent "### XXX" responses on openai 2023-09-25 23:14:35 -06:00
Cyberes 11e84db59c update database, tokenizer handle null prompt, convert top_p to vllm on openai, actually validate prompt on streaming 2023-09-25 22:32:48 -06:00