| File | Latest commit | Commit date |
| --- | --- | --- |
| custom_modeling | enable mllama in intel platform (#2610) | 2024-10-07 21:15:09 +02:00 |
| __init__.py | Add basic FP8 KV cache support (#2603) | 2024-10-04 17:51:48 +02:00 |
| causal_lm.py | Fixing linters. (#2650) | 2024-10-15 12:43:49 +02:00 |
| galactica.py | feat: add ruff and resolve issue (#2262) | 2024-07-26 10:29:09 -04:00 |
| idefics_causal_lm.py | Mllama flash version (#2585) | 2024-10-02 11:22:13 +02:00 |
| mllama_causal_lm.py | Mllama flash version (#2585) | 2024-10-02 11:22:13 +02:00 |
| model.py | feat: add ruff and resolve issue (#2262) | 2024-07-26 10:29:09 -04:00 |
| pali_gemma.py | feat: add ruff and resolve issue (#2262) | 2024-07-26 10:29:09 -04:00 |
| seq2seq_lm.py | Fixing linters. (#2650) | 2024-10-15 12:43:49 +02:00 |
| types.py | feat: add ruff and resolve issue (#2262) | 2024-07-26 10:29:09 -04:00 |