huggingface/text-generation-inference/integration-tests/models
Latest commit cfaa858070 by OlivierDehaene: feat(server): support fp16 for t5 (#360), fixes #349 (2023-05-23 18:16:48 +02:00)
Name | Last commit message | Last commit date
__snapshots__ | feat(server): support fp16 for t5 (#360) | 2023-05-23 18:16:48 +02:00
test_bloom_560m.py | feat(integration-tests): improve comparison and health checks (#336) | 2023-05-16 20:22:11 +02:00
test_bloom_560m_sharded.py | feat(integration-tests): improve comparison and health checks (#336) | 2023-05-16 20:22:11 +02:00
test_flash_llama.py | feat(integration-tests): improve comparison and health checks (#336) | 2023-05-16 20:22:11 +02:00
test_flash_neox.py | feat(server): support fp16 for t5 (#360) | 2023-05-23 18:16:48 +02:00
test_flash_neox_sharded.py | fix(server): fix init for flash causal lm (#352) | 2023-05-22 15:05:32 +02:00
test_flash_santacoder.py | feat(integration-tests): improve comparison and health checks (#336) | 2023-05-16 20:22:11 +02:00
test_flash_starcoder.py | feat(integration-tests): improve comparison and health checks (#336) | 2023-05-16 20:22:11 +02:00
test_mt0_base.py | feat(integration-tests): improve comparison and health checks (#336) | 2023-05-16 20:22:11 +02:00
test_t5_sharded.py | feat(server): support fp16 for t5 (#360) | 2023-05-23 18:16:48 +02:00
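
Each test_*.py above follows the same overall shape: start a text-generation-inference server for the model under test, wait for its health check to pass, send generation requests through the async client, and compare the responses against the stored snapshots under __snapshots__. The sketch below illustrates that shape for a sharded T5 model; the fixture names (launcher, response_snapshot), the handle API (health, client, generate), the model id, and the prompt are illustrative assumptions, not a copy of the repository's conftest.

```python
import pytest


@pytest.fixture(scope="module")
def t5_sharded_handle(launcher):
    # Assumed launcher fixture: starts a server for the model, sharded across 2 GPUs.
    with launcher("google/flan-t5-xxl", num_shard=2) as handle:
        yield handle


@pytest.fixture(scope="module")
async def t5_sharded(t5_sharded_handle):
    # Wait (up to a timeout in seconds) for the server to report healthy.
    await t5_sharded_handle.health(240)
    return t5_sharded_handle.client


@pytest.mark.asyncio
async def test_t5_sharded(t5_sharded, response_snapshot):
    response = await t5_sharded.generate(
        "Please answer the following question. What is the boiling point of Nitrogen?",
        max_new_tokens=10,
    )

    # The generated output is checked against the snapshot stored in
    # __snapshots__, so any regression shows up as a snapshot diff.
    assert response == response_snapshot
```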