hf_text-generation-inference/server/text_generation_server/models
Latest commit: 44b267ab22 by OlivierDehaene — fix: fix gpt-q params loading (2023-12-14 11:02:16 +01:00)
Name                  Last commit                         Date
custom_modeling       feat: add quant to mixtral (#1337)  2023-12-12 17:55:03 +01:00
__init__.py           chore: formatting                   2023-12-11 14:49:52 +01:00
bloom.py              fix: fix gpt-q params loading       2023-12-14 11:02:16 +01:00
cache_manager.py      feat: add mistral model (#1071)     2023-09-28 09:55:47 +02:00
causal_lm.py          chore: formatting                   2023-12-11 14:49:52 +01:00
flash_causal_lm.py    chore: formatting                   2023-12-11 14:49:52 +01:00
flash_llama.py        fix: fix gpt-q params loading       2023-12-14 11:02:16 +01:00
flash_mistral.py      fix: fix gpt-q params loading       2023-12-14 11:02:16 +01:00
flash_mixtral.py      chore: formatting                   2023-12-11 14:49:52 +01:00
flash_neox.py         fix: fix gpt-q params loading       2023-12-14 11:02:16 +01:00
flash_rw.py           fix: fix gpt-q params loading       2023-12-14 11:02:16 +01:00
flash_santacoder.py   fix: fix gpt-q params loading       2023-12-14 11:02:16 +01:00
galactica.py          fix: fix gpt-q params loading       2023-12-14 11:02:16 +01:00
gpt_neox.py           fix: fix gpt-q params loading       2023-12-14 11:02:16 +01:00
idefics.py            enable bfloat16 for cpu (#1034)     2023-09-19 17:19:28 +02:00
idefics_causal_lm.py  chore: formatting                   2023-12-11 14:49:52 +01:00
model.py              chore: formatting                   2023-12-11 14:49:52 +01:00
mpt.py                fix: fix gpt-q params loading       2023-12-14 11:02:16 +01:00
opt.py                fix: fix gpt-q params loading       2023-12-14 11:02:16 +01:00
rw.py                 enable bfloat16 for cpu (#1034)     2023-09-19 17:19:28 +02:00
santacoder.py         enable bfloat16 for cpu (#1034)     2023-09-19 17:19:28 +02:00
seq2seq_lm.py         chore: formatting                   2023-12-11 14:49:52 +01:00
t5.py                 enable bfloat16 for cpu (#1034)     2023-09-19 17:19:28 +02:00
types.py              chore: formatting                   2023-12-11 14:49:52 +01:00