hf_text-generation-inference/backends
Latest commit: df72c56b5b feat(backend): add guard in case top_k = 0 (Morgan Funtowicz, 2024-11-28 16:30:20 +01:00)
Directory      Last commit message                                                            Date
client         Choosing input/total tokens automatically based on available VRAM? (#2673)     2024-10-28 04:59:49 +01:00
grpc-metadata  Rebase TRT-llm (#2331)                                                         2024-07-31 10:33:10 +02:00
llamacpp       feat(backend): add guard in case top_k = 0                                     2024-11-28 16:30:20 +01:00
trtllm         chore: remove unrelated change to trtllm                                       2024-11-22 15:42:09 +01:00
v2             Fixing "deadlock" when python prompts for trust_remote_code by always (#2664)  2024-10-25 06:39:21 +02:00
v3             Choosing input/total tokens automatically based on available VRAM? (#2673)     2024-10-28 04:59:49 +01:00