| Author | Commit | Message | Date |
|---|---|---|---|
| Cyberes | 0718f10eb9 | t | 2023-10-05 18:36:02 -06:00 |
| Cyberes | fb8bc05b4c | t | 2023-10-05 18:30:54 -06:00 |
| Cyberes | acf409abfc | fix background logger, add gradio chat example | 2023-10-04 19:24:47 -06:00 |
| Cyberes | 1670594908 | fix import error | 2023-10-04 16:29:19 -06:00 |
| Cyberes | 94141b8ecf | fix processing not being decremented on streaming, fix confusion over queue, adjust stop sequences | 2023-10-02 20:53:08 -06:00 |
| Cyberes | d1c4e68f8b | fix openai models response | 2023-10-01 23:07:49 -06:00 |
| Cyberes | 21da2f6373 | fix openai error message | 2023-10-01 22:58:08 -06:00 |
| Cyberes | f7e9687527 | finish openai endpoints | 2023-10-01 16:04:53 -06:00 |
| Cyberes | 2a3ff7e21e | update openai endpoints | 2023-10-01 14:15:01 -06:00 |
| Cyberes | 624ca74ce5 | mvp | 2023-09-29 00:09:44 -06:00 |
| Cyberes | e7b57cad7b | set up cluster config and basic background workers | 2023-09-28 18:40:24 -06:00 |
| Cyberes | 105b66d5e2 | unify error message handling | 2023-09-27 14:48:47 -06:00 |
| Cyberes | aba2e5b9c0 | don't use db pooling, add LLM-ST-Errors header to disable formatted errors | 2023-09-26 23:59:22 -06:00 |
| Cyberes | d9bbcc42e6 | more work on openai endpoint | 2023-09-26 22:09:11 -06:00 |
| Cyberes | e0af2ea9c5 | convert to gunicorn | 2023-09-26 13:32:33 -06:00 |
| Cyberes | 0eb901cb52 | don't log entire request on failure | 2023-09-26 12:32:19 -06:00 |
| Cyberes | bbdb9c9d55 | try to prevent "### XXX" responses on openai | 2023-09-25 23:14:35 -06:00 |
| Cyberes | 2d299dbae5 | openai_force_no_hashes | 2023-09-25 22:01:57 -06:00 |
| Cyberes | 8240a1ebbb | fix background log not doing anything | 2023-09-25 18:18:29 -06:00 |
| Cyberes | 135bd743bb | fix homepage slowness, fix incorrect 24 hr prompters, fix redis wrapper, | 2023-09-25 17:20:21 -06:00 |
| Cyberes | 3eaabc8c35 | fix copied code | 2023-09-25 12:38:02 -06:00 |
| Cyberes | 44e692c9cf | remove debug print | 2023-09-25 12:35:36 -06:00 |
| Cyberes | 1646a00987 | implement streaming on openai, improve streaming, run DB logging in background thread | 2023-09-25 12:30:40 -06:00 |
| Cyberes | bbe5d5a8fe | improve openai endpoint, exclude system tokens more places | 2023-09-25 09:32:23 -06:00 |
| Cyberes | 320f51e01c | further align openai endpoint with expected responses | 2023-09-24 21:45:30 -06:00 |
| Cyberes | cb99c3490e | rewrite tokenizer, restructure validation | 2023-09-24 13:02:30 -06:00 |
| Cyberes | 62412f4873 | add config setting for hostname | 2023-09-23 23:24:08 -06:00 |
| Cyberes | 84a1fcfdd8 | don't store host if it's an IP | 2023-09-23 23:14:22 -06:00 |
| Cyberes | 3c1254d3bf | cache stats in background | 2023-09-17 18:55:36 -06:00 |
| Cyberes | 3100b0a924 | set up queue to work with gunicorn processes, other improvements | 2023-09-14 17:38:20 -06:00 |
| Cyberes | a89295193f | add moderation endpoint to openai api, update config | 2023-09-14 15:07:17 -06:00 |
| Cyberes | 8f4f17166e | adjust | 2023-09-14 14:36:22 -06:00 |
| Cyberes | 93a344f4c5 | check if the backend crapped out, print some more stuff | 2023-09-14 14:26:25 -06:00 |
| Cyberes | 79b1e01b61 | option to disable streaming, improve timeout on requests to backend, fix error handling, reduce duplicate code, misc other cleanup | 2023-09-14 14:05:50 -06:00 |
| Cyberes | 12e894032e | show the openai system prompt | 2023-09-13 20:25:56 -06:00 |
| Cyberes | 9740df07c7 | add openai-compatible backend | 2023-09-12 16:40:09 -06:00 |