hf_text-generation-inference/server
Daniël de Kok 9935720c87
Add support for repacking AWQ weights for GPTQ-Marlin (#2278)
* Add support for repacking AWQ weights for GPTQ-Marlin

So far we couldn't support AWQ because virtually all AWQ models use
asymmetric quantization, which GPTQ-Marlin did not support. GPTQ-Marlin
has recently added support for AWQ repacking and AWQ asymmetric
quantization (zero_point=True).

This change updates all GPTQ-Marlin kernels from upstream and wires up
AWQ support. For now enabling AWQ using Marlin requires running TGI with
`--quantize gptq`.

* Enable Marlin for supported AWQ configurations by default

This makes the AWQ -> GPTQ repack test redundant, since we are now
testing this with the regular AWQ test.
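As the commit message describes, the Marlin path for AWQ models could initially be forced with `--quantize gptq`. A minimal sketch of such a launch follows; the model id is illustrative only and not taken from this commit:

```shell
# Hypothetical invocation: serve an AWQ-quantized model through the
# GPTQ-Marlin kernels by passing --quantize gptq.
# The model id below is an example, not part of this change.
text-generation-launcher \
  --model-id TheBloke/Mistral-7B-Instruct-v0.2-AWQ \
  --quantize gptq
```

With the follow-up change in this commit, supported AWQ configurations use Marlin by default, so the explicit flag is only needed to force the repack path.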
2024-07-23 13:08:20 +02:00
custom_kernels
exllama_kernels
exllamav2_kernels
marlin  Add support for repacking AWQ weights for GPTQ-Marlin (#2278) 2024-07-23 13:08:20 +02:00
tests
text_generation_server
.gitignore
Makefile  feat(fp8): use fbgemm kernels and load fp8 weights directly (#2248) 2024-07-20 19:02:04 +02:00
Makefile-awq
Makefile-eetq
Makefile-fbgemm
Makefile-flash-att
Makefile-flash-att-v2  Softcapping for gemma2. (#2273) 2024-07-22 18:27:10 +02:00
Makefile-lorax-punica
Makefile-selective-scan
Makefile-vllm
README.md
fbgemm_remove_unused.patch
fix_torch90a.sh
poetry.lock
pyproject.toml
requirements_cuda.txt
requirements_intel.txt
requirements_rocm.txt

README.md

# Text Generation Inference Python gRPC Server

A Python gRPC server for Text Generation Inference

## Install

```shell
make install
```

## Run

```shell
make run-dev
```