text-generation-inference/integration-tests/models

Latest commit: 62f91f78ac — feat(server): support vectorized warpers in flash causal lm (#317)
Author: OlivierDehaene
Co-authored-by: Joel Lamy-Poirier <joel.lamy-poirier@servicenow.com>
Date: 2023-05-26 12:30:27 +02:00
File                           Last commit message                                                   Last commit date
__snapshots__/                 feat(server): support vectorized warpers in flash causal lm (#317)    2023-05-26 12:30:27 +02:00
test_bloom_560m.py             feat(integration-tests): improve comparison and health checks (#336)  2023-05-16 20:22:11 +02:00
test_bloom_560m_sharded.py     feat(integration-tests): improve comparison and health checks (#336)  2023-05-16 20:22:11 +02:00
test_flash_llama.py            feat(server): support vectorized warpers in flash causal lm (#317)    2023-05-26 12:30:27 +02:00
test_flash_neox.py             feat(server): support fp16 for t5 (#360)                              2023-05-23 18:16:48 +02:00
test_flash_neox_sharded.py     fix(server): fix init for flash causal lm (#352)                      2023-05-22 15:05:32 +02:00
test_flash_santacoder.py       feat(integration-tests): improve comparison and health checks (#336)  2023-05-16 20:22:11 +02:00
test_flash_starcoder.py        feat(server): support vectorized warpers in flash causal lm (#317)    2023-05-26 12:30:27 +02:00
test_mt0_base.py               feat(server): support vectorized warpers in flash causal lm (#317)    2023-05-26 12:30:27 +02:00
test_t5_sharded.py             feat(server): support fp16 for t5 (#360)                              2023-05-23 18:16:48 +02:00