hf_text-generation-inference/server/text_generation_server/layers/moe
Latest commit: 3f14cd1420 by Daniël de Kok, 2024-09-24 14:27:06 +02:00
Add `DenseMoELayer` and wire it up in Mixtral/Deepseek V2 (#2537)
This replaces the custom MoE layers in both models (a minimal sketch of a dense MoE layer follows the file listing below).
__init__.py Add `DenseMoELayer` and wire it up in Mixtral/Deepseek V2 (#2537) 2024-09-24 14:27:06 +02:00
unquantized.py Move to moe-kernels package and switch to common MoE layer (#2511) 2024-09-17 18:08:58 +02:00
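The `DenseMoELayer` named in the commit above is the dense counterpart of a fused sparse MoE layer: rather than dispatching each token only to its top-k experts, every expert runs on every token and the router weights mix the results. Below is a minimal sketch of that idea under Mixtral-style assumptions (SwiGLU experts, top-k softmax routing); the class name `SimpleDenseMoE`, the parameter names, and all shapes are illustrative assumptions, not the actual TGI interface.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleDenseMoE(nn.Module):
    """Dense MoE sketch: every expert processes every token; router
    weights (restricted to the top-k experts) mix the expert outputs."""

    def __init__(self, hidden: int, ffn: int, n_experts: int, top_k: int):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(hidden, n_experts, bias=False)
        # One SwiGLU-style gated MLP per expert, as in Mixtral.
        self.gate_proj = nn.Parameter(torch.randn(n_experts, hidden, ffn) * 0.02)
        self.up_proj = nn.Parameter(torch.randn(n_experts, hidden, ffn) * 0.02)
        self.down_proj = nn.Parameter(torch.randn(n_experts, ffn, hidden) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, hidden)
        weights = F.softmax(self.router(x), dim=-1)  # (tokens, n_experts)
        # Zero all but the top-k router weights and renormalize, so the
        # result matches what sparse top-k dispatch would compute.
        topk = weights.topk(self.top_k, dim=-1)
        mask = torch.zeros_like(weights).scatter(-1, topk.indices, topk.values)
        weights = mask / mask.sum(dim=-1, keepdim=True)
        # Dense path: batched matmuls over the expert dimension.
        gate = torch.einsum("th,ehf->etf", x, self.gate_proj)
        up = torch.einsum("th,ehf->etf", x, self.up_proj)
        out = torch.einsum("etf,efh->eth", F.silu(gate) * up, self.down_proj)
        # Mix per-expert outputs with the router weights.
        return torch.einsum("te,eth->th", weights, out)


# Mixtral-8x7B-like shapes (illustrative):
moe = SimpleDenseMoE(hidden=4096, ffn=14336, n_experts=8, top_k=2)
y = moe(torch.randn(3, 4096))  # -> (3, 4096)
```

The dense path trades extra FLOPs for simplicity and portability, which makes it a reasonable fallback where fused sparse kernels (such as those from the moe-kernels package referenced in the unquantized.py entry above) are not available.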