| Name | Last commit message | Last commit date |
| --- | --- | --- |
| models | small fix on idefics (#954) | 2023-09-01 18:44:34 +02:00 |
| pb | feat(server): clear cache on error (#143) | 2023-03-28 11:29:35 +02:00 |
| utils | Fixing top_k tokens when k ends up < 0 (#966) | 2023-09-01 00:22:03 +02:00 |
| __init__.py | feat(clients): Python client (#103) | 2023-03-07 18:52:22 +01:00 |
| cli.py | Fixing the lora adaptation on docker. (#935) | 2023-08-28 11:13:24 +02:00 |
| interceptor.py | feat(server): empty cache on errors | 2023-07-12 17:06:19 +02:00 |
| server.py | Adding Idefics multi modal model. (#842) | 2023-08-17 14:38:49 +02:00 |
| tracing.py | feat(clients): Python client (#103) | 2023-03-07 18:52:22 +01:00 |