hf_text-generation-inference/server/text_generation/models
Latest commit: 2ad895a6cc by OlivierDehaene, 2023-02-01 14:43:59 +01:00
feat(server): allow gpt-neox models with odd vocab sizes to be sharded (#48)
__init__.py feat(server): Support GPT-Neox (#39) 2023-01-31 18:53:56 +01:00
bloom.py feat(server): Support GPT-Neox (#39) 2023-01-31 18:53:56 +01:00
causal_lm.py feat(server): Support GPT-Neox (#39) 2023-01-31 18:53:56 +01:00
galactica.py feat(server): Support GPT-Neox (#39) 2023-01-31 18:53:56 +01:00
gpt_neox.py feat(server): allow gpt-neox models with odd vocab sizes to be sharded (#48) 2023-02-01 14:43:59 +01:00
model.py fix(server): Minor refactorization using new_zeros (#24) 2023-01-17 09:10:22 +01:00
santacoder.py feat(server): Support GPT-Neox (#39) 2023-01-31 18:53:56 +01:00
seq2seq_lm.py feat(server): Support GPT-Neox (#39) 2023-01-31 18:53:56 +01:00
types.py feat: Add token streaming using ServerSideEvents support (#41) 2023-01-31 17:04:00 +01:00
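
The commit headlining this listing, "feat(server): allow gpt-neox models with odd vocab sizes to be sharded (#48)" (see gpt_neox.py above), concerns splitting a model across shards when its vocabulary size does not divide evenly by the number of shards. A common way to handle this is to pad the embedding weight up to the next multiple of the world size before slicing it per rank; the sketch below illustrates that idea with hypothetical helpers (pad_to_multiple, shard_rows) and is not the repository's actual implementation.

    # Illustration only: pad an embedding whose vocab size is not divisible
    # by the number of shards, then give each rank an equal row slice.
    # pad_to_multiple and shard_rows are hypothetical names, not taken from
    # text-generation-inference.
    import torch


    def pad_to_multiple(size: int, multiple: int) -> int:
        """Round `size` up to the nearest multiple of `multiple`."""
        return ((size + multiple - 1) // multiple) * multiple


    def shard_rows(weight: torch.Tensor, rank: int, world_size: int) -> torch.Tensor:
        """Return the rows of the (padded) embedding owned by `rank`."""
        vocab_size, hidden = weight.shape
        padded = pad_to_multiple(vocab_size, world_size)
        if padded != vocab_size:
            # Extra rows are zero-initialized and never addressed by real token ids.
            pad = torch.zeros(padded - vocab_size, hidden, dtype=weight.dtype)
            weight = torch.cat([weight, pad], dim=0)
        block = padded // world_size
        return weight[rank * block:(rank + 1) * block]


    # Example: an "odd" vocab of 50257 rows split evenly across 4 shards.
    w = torch.randn(50257, 8)
    shards = [shard_rows(w, r, 4) for r in range(4)]
    assert all(s.shape[0] == 12565 for s in shards)  # 50260 / 4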
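
The types.py entry references "feat: Add token streaming using ServerSideEvents support (#41)", i.e. streaming generated tokens to clients over Server-Sent Events. The sketch below shows one way a client might consume such a stream; the endpoint path and JSON payload are assumptions, not the project's documented API.

    # Illustration only: reading a Server-Sent Events token stream.
    # The URL path and request body below are hypothetical.
    import json

    import requests


    def stream_tokens(url: str, prompt: str):
        """Yield decoded JSON events from an SSE response, one per token."""
        with requests.post(url, json={"inputs": prompt}, stream=True) as resp:
            resp.raise_for_status()
            for line in resp.iter_lines(decode_unicode=True):
                # SSE frames each event as a line of the form "data: {...}".
                if line and line.startswith("data:"):
                    yield json.loads(line[len("data:"):].strip())


    # for event in stream_tokens("http://localhost:8080/generate_stream", "Hello"):
    #     print(event)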