hf_text-generation-inference/server/Makefile-flashinfer

# Install the FlashInfer attention kernels from the prebuilt wheel index
# (CUDA 12.4 / PyTorch 2.4 builds).
install-flashinfer:
	pip install flashinfer==0.1.5 -i https://flashinfer.ai/whl/cu124/torch2.4
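
Usage sketch (assuming standard Make semantics; the invocation below is not part of the file itself): the target can be run stand-alone from the server directory with

    make -f Makefile-flashinfer install-flashinfer

In the upstream text-generation-inference repository the server Makefile is expected to include this fragment, in which case a plain "make install-flashinfer" from that directory reaches the same target.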