hf_text-generation-inference/integration-tests
Latest commit e74bd41e0f by OlivierDehaene: feat(server): add paged attention to flash models (#516), closes #478 (2023-06-30 19:09:59 +02:00)
models/            feat(server): add paged attention to flash models (#516)        2023-06-30 19:09:59 +02:00
conftest.py        feat(server): Rework model loading (#344)                        2023-06-08 14:51:52 +02:00
pytest.ini         feat(server): Rework model loading (#344)                        2023-06-08 14:51:52 +02:00
requirements.txt   feat(server): only compute prefill logprobs when asked (#406)    2023-06-02 17:12:30 +02:00
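The layout (conftest.py, pytest.ini, requirements.txt, and a models/ directory) points to a pytest-based integration test suite that exercises a running text-generation server. As a rough illustration only, here is a minimal sketch of such a test, assuming a server reachable at http://localhost:8080 with a /generate route; the fixture, payload, and response field names are assumptions for this sketch, not fixtures taken from this repository.

```python
# Hypothetical sketch: not the repository's actual fixtures or helpers.
import pytest
import requests

BASE_URL = "http://localhost:8080"  # assumed local text-generation server


@pytest.fixture(scope="module")
def client():
    """Yield an HTTP session pointed at the assumed server."""
    with requests.Session() as session:
        yield session


def test_generate_returns_text(client):
    # Assumed request/response shape; the real API may differ.
    payload = {
        "inputs": "What is deep learning?",
        "parameters": {"max_new_tokens": 10},
    }
    response = client.post(f"{BASE_URL}/generate", json=payload, timeout=60)
    assert response.status_code == 200
    body = response.json()
    assert "generated_text" in body
    assert isinstance(body["generated_text"], str)
```

In the real suite, conftest.py would typically provide shared fixtures (for example, launching the server and building clients) and the models/ directory would hold per-model test cases, with pytest.ini and requirements.txt pinning the test configuration and dependencies.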