hf_text-generation-inference/proto
Latest commit e74bd41e0f by OlivierDehaene: feat(server): add paged attention to flash models (#516). Closes #478. (2023-06-30 19:09:59 +02:00)
Contents: generate.proto (last changed by the commit above)
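generate.proto defines the gRPC contract that the text-generation-inference router uses to drive the Python model server, which is why server-side changes such as the paged-attention work in #516 touch this directory. As a rough, illustrative sketch only, assuming a typical prefill/decode split; the service, message names, and fields below are placeholders for orientation, not the actual contents of generate.proto:

    // Illustrative sketch only: names and fields are assumptions,
    // not the actual definitions in this repository's generate.proto.
    syntax = "proto3";

    package generate.v1;

    service TextGenerationService {
      // Run the prefill (prompt) pass for a new batch of requests.
      rpc Prefill (PrefillRequest) returns (PrefillResponse);
      // Advance cached batches by one decode step.
      rpc Decode (DecodeRequest) returns (DecodeResponse);
      // Drop cached state (e.g. KV-cache blocks) for finished requests.
      rpc ClearCache (ClearCacheRequest) returns (ClearCacheResponse);
    }

    message PrefillRequest { /* batch of tokenized requests */ }
    message PrefillResponse { /* generated tokens plus batch state */ }
    message DecodeRequest { /* ids of cached batches to advance */ }
    message DecodeResponse { /* next tokens plus updated batch state */ }
    message ClearCacheRequest { /* optional batch id to clear */ }
    message ClearCacheResponse {}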