hf_text-generation-inference/server/text_generation_server/layers/attention
Latest commit 1cebccc72b by drbh (2024-08-13 16:19:46 +02:00):
fix: adds causal to attention params (#2408)
fix: adds causal to attention params to check when using flash attn v1
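The head commit (#2408) threads a `causal` flag through the attention parameters so the flash attn v1 path can be checked against it, since that kernel only handles the causal case. Below is a minimal sketch of the idea in PyTorch; the function name, signature, and SDPA fallback are illustrative assumptions, not the repository's actual code in cuda.py.

```python
import torch


def attention(
    q: torch.Tensor,
    k: torch.Tensor,
    v: torch.Tensor,
    causal: bool = True,
    flash_attn_version: int = 1,  # hypothetical knob standing in for the real backend check
) -> torch.Tensor:
    # flash attn v1 only ships a causal kernel, so fail fast on
    # non-causal requests instead of silently applying the wrong mask.
    if flash_attn_version == 1 and not causal:
        raise NotImplementedError("flash attn v1 supports causal attention only")
    # Stand-in for the real flash attention kernels.
    return torch.nn.functional.scaled_dot_product_attention(q, k, v, is_causal=causal)
```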
__init__.py            feat: add ruff and resolve issue (#2262)                                    2024-07-26 10:29:09 -04:00
common.py              Using an enum for flash backends (paged/flashdecoding/flashinfer) (#2385)   2024-08-09 16:41:17 +02:00
cuda.py                fix: adds causal to attention params (#2408)                                2024-08-13 16:19:46 +02:00
flash_attn_triton.py   feat: add ruff and resolve issue (#2262)                                    2024-07-26 10:29:09 -04:00
flash_infer.py         Add FlashInfer support (#2354)                                              2024-08-09 11:42:00 +02:00
ipex.py                PR 2337 CI branch (#2379)                                                   2024-08-08 12:30:29 -04:00
rocm.py                Using an enum for flash backends (paged/flashdecoding/flashinfer) (#2385)   2024-08-09 16:41:17 +02:00
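The commit touching common.py and rocm.py (#2385) replaces ad-hoc backend checks with an enum over the flash backends named in its title (paged, flashdecoding, flashinfer). A sketch of what such an enum and its lookup might look like; the names below are hypothetical, not the module's actual definitions.

```python
from enum import Enum, auto


class AttentionBackend(Enum):
    PAGED = auto()
    FLASHDECODING = auto()
    FLASHINFER = auto()


def select_backend(name: str) -> AttentionBackend:
    # Resolve the configured string to an enum member once, so the rest
    # of the code compares members instead of scattering string literals.
    try:
        return AttentionBackend[name.upper()]
    except KeyError:
        raise ValueError(f"unknown attention backend: {name}") from None
```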