hf_text-generation-inference/server/marlin/marlin_kernels/sparse
Daniël de Kok f1f98e369f
Add support for Marlin 2:4 sparsity (#2102)
This change adds support for 2:4 sparsity when using Marlin
quantization. The 2:4 kernel is used when (see the sketch below):

* the quantizer is `marlin`;
* the quantizer checkpoint format is `marlin_24`.

Fixes #2098.
2024-06-25 21:09:42 +02:00
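A minimal sketch of that selection rule follows. It is not the actual dispatch code in text-generation-inference; the `quantize` and `checkpoint_format` names are illustrative stand-ins for the launcher's quantization flag and the format declared by the checkpoint's quantization config.

```python
# Illustrative sketch only: the 2:4 sparse Marlin kernel is chosen exactly when
# the quantizer is `marlin` and the checkpoint format is `marlin_24`, as stated
# in the commit message above. Function and parameter names are hypothetical.
def use_marlin_24_kernel(quantize: str, checkpoint_format: str | None) -> bool:
    return quantize == "marlin" and checkpoint_format == "marlin_24"


assert use_marlin_24_kernel("marlin", "marlin_24")      # 2:4 sparse kernel
assert not use_marlin_24_kernel("marlin", "marlin")     # dense Marlin checkpoint
assert not use_marlin_24_kernel("gptq", "marlin_24")    # different quantizer
```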
common                      Add support for Marlin 2:4 sparsity (#2102)    2024-06-25 21:09:42 +02:00
marlin_24_cuda_kernel.cu    Add support for Marlin 2:4 sparsity (#2102)    2024-06-25 21:09:42 +02:00