Commit Graph

17 Commits

Author SHA1 Message Date
Morgan Funtowicz d4aee42fd8 feat(backend): add logit parameter in the callback fn 2024-11-14 08:42:01 +01:00
Morgan Funtowicz f39edc72ff feat(backend): add mapping for ignore_eos_token stopping criteria 2024-11-14 08:42:01 +01:00
Morgan Funtowicz d52b4c4978 feat(backend): full rework of the backend internal to safer c++ 2024-11-14 08:42:01 +01:00
Morgan Funtowicz b98c635781 feat(backend): entirely rewrite backend 2024-11-14 08:42:01 +01:00
Morgan Funtowicz 0c1dd0ed2b feat(llamacpp): wip explosion 2024-11-14 08:42:01 +01:00
Morgan Funtowicz a316c53255 feat(llamacpp): expose number of threads for the backend when constructing the model 2024-11-14 08:42:01 +01:00
Morgan Funtowicz e4d803c94e feat(backend): build and link through build.rs 2024-11-14 08:42:01 +01:00
Morgan Funtowicz 355d8a55b4 feat(backend): wip Rust binding 2024-11-14 08:42:01 +01:00
Morgan Funtowicz f9c248657d chore(backend): minor formatting 2024-11-14 08:42:01 +01:00
Morgan Funtowicz 37faeb34b2 feat(backend): expose frequency and repetition penalties 2024-11-14 08:42:01 +01:00
Morgan Funtowicz d4b5be10f9 feat(backend): minor refactor 2024-11-14 08:42:01 +01:00
Morgan Funtowicz 92bb113653 feat(backend): use llama_token as TokenId type 2024-11-14 08:42:01 +01:00
Morgan Funtowicz 45d5a6a8c5 feat(backend): add some initial decoding steps 2024-11-14 08:42:01 +01:00
Morgan Funtowicz 0911076320 feat(backend): correctly load llama.cpp model from llama api and not gpt2 2024-11-14 08:42:01 +01:00
Morgan Funtowicz 05ad684676 feat(llamacpp): enable cuda 2024-11-14 08:42:01 +01:00
Morgan Funtowicz 52d57dca79 feat(llamacpp): initial end2end build 2024-11-14 08:42:01 +01:00
Morgan Funtowicz aa1fcba59f feat(llamacpp): initial commit 2024-11-14 08:42:01 +01:00
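
One of the commits above ("feat(backend): build and link through build.rs") wires the C++ llama.cpp backend into the Rust crate at build time. The sketch below only illustrates the general build.rs linking pattern; the library name, directory layout, and environment variable are hypothetical and not taken from this repository, which drives its own C++ build before linking.

```rust
// build.rs — minimal sketch of linking a prebuilt llama.cpp static library
// into a Rust crate. All paths and names here are assumptions for illustration.
fn main() {
    // Hypothetical location of the compiled llama.cpp artifacts.
    let lib_dir = std::env::var("LLAMA_CPP_LIB_DIR")
        .unwrap_or_else(|_| "vendor/llama.cpp/build".into());

    // Tell cargo where to find the native library and which one to link.
    println!("cargo:rustc-link-search=native={lib_dir}");
    println!("cargo:rustc-link-lib=static=llama");

    // Link the C++ standard library (platform dependent; libstdc++ on Linux).
    println!("cargo:rustc-link-lib=dylib=stdc++");

    // Re-run the build script when the native sources change.
    println!("cargo:rerun-if-changed=csrc");
}
```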