OlivierDehaene
5b9de4a1d3
fix(server): blacklist local files ( #609 )
Closes #589 and #602
2023-07-13 21:54:55 +02:00
Nicolas Patry
e943a294bc
fix(server): harden the weights choice to save on disk. ( #561 )
- Look at the `transformers` base class to check for
`_keys_to_ignore_on_load_missing` or `_tied_weights`, which are the
standard attributes used to select the keys NOT to save on disk (since they
are ignored on load); see the sketch below.
- Modified the safetensors code (to be reflected upstream in safetensors even though
it's an internal function).
- Will not work for `trust_remote_code=True` repos (like santacoder).
Should help with:
https://github.com/huggingface/text-generation-inference/issues/555,
https://github.com/huggingface/text-generation-inference/pull/501,
https://github.com/huggingface/text-generation-inference/issues/556,
and https://github.com/huggingface/text-generation-inference/issues/482#issuecomment-1623713593
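A minimal sketch of the idea, assuming a `transformers` `PreTrainedModel` subclass whose `_keys_to_ignore_on_load_missing` attribute holds regex patterns; the helper name and usage below are hypothetical, not the actual TGI implementation:

```python
import re

from safetensors.torch import save_file


def drop_ignored_keys(state_dict, model_cls):
    """Drop tensors that `transformers` re-creates on load (tied weights),
    based on the class-level `_keys_to_ignore_on_load_missing` regex patterns.
    Hypothetical helper, for illustration only."""
    patterns = getattr(model_cls, "_keys_to_ignore_on_load_missing", None) or []
    return {
        name: tensor
        for name, tensor in state_dict.items()
        if not any(re.search(pattern, name) for pattern in patterns)
    }


# Usage sketch: save only the weights that will not be regenerated at load time.
# filtered = drop_ignored_keys(model.state_dict(), type(model))
# save_file(filtered, "model.safetensors")
```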
2023-07-07 14:50:12 +02:00
Nicolas Patry
49b4b33e80
feat(server): Update convert logic. ( #483 )
Should be more robust to shared tensors (which are fine when using
`from_pretrained`), but it forces us to add new checks in our loading
code (since the chosen key to keep might be different from the one
`transformers` picks).
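For context, a simplified sketch of how shared (tied) tensors can be detected and deduplicated before writing a `.safetensors` file, since safetensors does not allow saving tensors that share memory; the names here are illustrative, not the exact convert logic:

```python
from collections import defaultdict

import torch


def deduplicate_shared_tensors(state_dict):
    """Keep one name per group of tensors sharing the same underlying storage.

    Simplified sketch: the key that is kept may differ from the one
    `transformers` would keep, hence the extra checks needed at load time."""
    groups = defaultdict(list)
    for name, tensor in state_dict.items():
        # Fully tied tensors report the same data pointer.
        groups[tensor.data_ptr()].append(name)

    kept = {}
    for names in groups.values():
        chosen = sorted(names)[0]  # deterministic, but arbitrary
        kept[chosen] = state_dict[chosen].clone().contiguous()
    return kept


if __name__ == "__main__":
    weight = torch.zeros(4, 4)
    # "transformer.wte.weight" and "lm_head.weight" are tied: same storage.
    tied = {"transformer.wte.weight": weight, "lm_head.weight": weight}
    print(list(deduplicate_shared_tensors(tied)))  # ['lm_head.weight']
```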
---------
Co-authored-by: Ubuntu <ubuntu@ip-172-31-41-161.ec2.internal>
2023-06-23 12:40:46 +02:00
OlivierDehaene
ece7ffa40a
feat(server): improve flash attention import errors ( #465 )
@lewtun, is this enough?
Closes #458
Closes #456
2023-06-19 09:53:45 +02:00
OlivierDehaene
62f91f78ac
feat(server): support vectorized warpers in flash causal lm ( #317 )
Co-authored-by: Joel Lamy-Poirier <joel.lamy-poirier@servicenow.com>
2023-05-26 12:30:27 +02:00
Nicolas Patry
b4aa87db58
feat(server): decrease convert RAM requirements ( #286 )
2023-05-05 17:57:02 +02:00
Nicolas Patry
690fc31757
fix(server): fix convert ( #284 )
2023-05-05 15:28:08 +02:00
Nicolas Patry
f08343d44d
fix(server): Removes the parallelism in file conversion (during download) ( #275 )
2023-05-04 15:22:54 +02:00
OlivierDehaene
3fef90d50f
feat(clients): Python client ( #103 )
2023-03-07 18:52:22 +01:00