hf_text-generation-inference/server/tests
Latest commit: 299217c95c "feat(server): add flash attention llama (#144)" by OlivierDehaene, 2023-04-11 16:38:22 +02:00
Name          Last commit                                                   Date
models/       feat(server): add flash attention llama (#144)                2023-04-11 16:38:22 +02:00
utils/        fix(server): fix escape characters in stop sequence (#155)    2023-04-05 19:37:41 +02:00
conftest.py   feat: support typical sampling (#114)                         2023-03-09 11:33:57 +01:00
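The layout is a standard pytest suite: test modules under models/ and utils/, with a shared conftest.py at the top of the tests directory. As a minimal sketch only (the fixture name and parameter values below are assumptions for illustration, not the repository's actual fixtures), a conftest at this level typically exposes shared fixtures that tests in the subdirectories can request by name, for example sampling settings such as typical_p referenced by the conftest commit:

```python
# Hypothetical conftest.py sketch for this test layout.
# Fixture name and values are illustrative assumptions, not the repo's code.
import pytest


@pytest.fixture
def default_sampling_parameters():
    # Shared sampling settings that tests under models/ or utils/ could
    # request as an argument; typical_p is included because the conftest
    # commit message mentions typical sampling.
    return {
        "temperature": 1.0,
        "top_k": 0,
        "top_p": 1.0,
        "typical_p": 1.0,
        "do_sample": False,
    }
```

A test in models/ would then declare `default_sampling_parameters` as a parameter and pytest would inject the shared dictionary, keeping sampling defaults in one place.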