Commit Graph

30 Commits

Author SHA1 Message Date
v0xie 9588721197 feat: support LyCORIS BOFT 2024-02-07 04:49:17 -08:00
v0xie fd383140cf fix: wrong devices for eye and constraint 2024-01-22 02:52:34 -08:00
Kohaku-Blueleaf f8f38c7c28 Fix dtype casting for OFT module 2024-01-05 16:31:48 +08:00
Kohaku-Blueleaf 3772a82a70 better naming and correct order for device. 2023-12-14 01:47:13 +08:00
Kohaku-Blueleaf 8fc67f3851 remove debug print 2023-12-14 01:44:49 +08:00
Kohaku-Blueleaf 265bc26c21 Use self.scale instead of custom finalize 2023-12-14 01:43:24 +08:00
Kohaku-Blueleaf 735c9e8059 Fix network_oft 2023-12-14 01:38:32 +08:00
v0xie eb667e715a feat: LyCORIS/kohya OFT network support 2023-11-15 18:28:48 -08:00
v0xie d6d0b22e66 fix: ignore calc_scale() for COFT which has very small alpha 2023-11-15 03:08:50 -08:00
v0xie bbf00a96af refactor: remove unused function 2023-11-04 14:56:47 -07:00
v0xie 329c8bacce refactor: use same updown for both kohya OFT and LyCORIS diag-oft 2023-11-04 14:54:36 -07:00
v0xie f6c8201e56 refactor: move factorization to lyco_helpers, separate calc_updown for kohya and kb 2023-11-03 19:35:15 -07:00
v0xie fe1967a4c4 skip multihead attn for now 2023-11-03 17:52:55 -07:00
v0xie d727ddfccd wip: trying to support both types of OFT; kblueleaf's diag_oft has MultiheadAttn, which kohya's doesn't; attempted to create a new module based off network_lora.py; errors about tensor dim mismatch 2023-11-02 00:13:11 -07:00
v0xie a2fad6ee05 test implementation based on kohaku diag-oft implementation 2023-11-01 22:34:27 -07:00
v0xie 6523edb8a4 style: conform style 2023-10-22 09:31:15 -07:00
v0xie 3b8515d2c9 fix: multiplier applied twice in finalize_updown 2023-10-22 09:27:48 -07:00
v0xie 4a50c9638c refactor: remove unused OFT functions 2023-10-22 08:54:24 -07:00
v0xie de8ee92ed8 fix: use merge_weight to cache value 2023-10-21 17:37:17 -07:00
v0xie 76f5abdbdb style: cleanup oft 2023-10-21 16:07:45 -07:00
v0xie fce86ab7d7 fix: support multiplier, no forward pass hook 2023-10-21 16:03:54 -07:00
v0xie 7683547728 fix: return orig weights during updown, merge weights before forward 2023-10-21 14:42:24 -07:00
v0xie 2d8c894b27 refactor: use forward hook instead of custom forward 2023-10-21 13:43:31 -07:00
v0xie 0550659ce6 style: fix ambiguous variable name 2023-10-19 13:13:02 -07:00
v0xie d10c4db57e style: formatting 2023-10-19 12:52:14 -07:00
v0xie 321680ccd0 refactor: fix constraint, re-use get_weight 2023-10-19 12:41:17 -07:00
v0xie eb01d7f0e0 faster by calculating R in updown and using cached R in forward 2023-10-18 04:56:53 -07:00
v0xie 853e21d98e faster by using cached R in forward 2023-10-18 04:27:44 -07:00
v0xie 1c6efdbba7 inference working but SLOW 2023-10-18 04:16:01 -07:00
v0xie ec718f76b5 wip incorrect OFT implementation 2023-10-17 23:35:50 -07:00
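
Notes on the techniques in this history

The early commits ("wip incorrect OFT implementation", "inference working but SLOW", "faster by calculating R in updown and using cached R in forward") revolve around building the block-orthogonal rotation R once and reusing it when computing the weight delta. The sketch below only illustrates that idea; names such as `OFTSketch`, `oft_blocks`, and `cayley` are assumptions for illustration, not the repo's actual network_oft.py API.

```python
import torch

def cayley(q: torch.Tensor) -> torch.Tensor:
    """Cayley transform: map skew-symmetric blocks Q to orthogonal blocks
    R = (I + Q)(I - Q)^-1, batched over the leading dimension."""
    # the identity must live on the same device/dtype as q
    # (cf. "fix: wrong devices for eye and constraint")
    eye = torch.eye(q.shape[-1], device=q.device, dtype=q.dtype)
    return torch.linalg.solve(eye - q, eye + q)

class OFTSketch:
    """Caches R so repeated updown/forward calls only pay for a block matmul."""

    def __init__(self, oft_blocks: torch.Tensor, multiplier: float = 1.0):
        self.oft_blocks = oft_blocks      # (num_blocks, block_size, block_size)
        self.multiplier = multiplier
        self._cached_r = None

    def get_r(self) -> torch.Tensor:
        if self._cached_r is None:
            # enforce skew-symmetry so cayley() really returns orthogonal blocks,
            # and solve in float32 (cf. "Fix dtype casting for OFT module")
            q = (self.oft_blocks - self.oft_blocks.transpose(-1, -2)).float()
            self._cached_r = cayley(q)
        return self._cached_r

    def calc_updown(self, orig_weight: torch.Tensor) -> torch.Tensor:
        """Return the additive delta (R - I) @ W so callers can do W + updown."""
        r = self.get_r().to(orig_weight)  # match the original weight's dtype/device
        num_blocks, block_size, _ = r.shape
        w = orig_weight.reshape(num_blocks, block_size, -1)
        merged = torch.bmm(r, w).reshape_as(orig_weight)
        return self.multiplier * (merged - orig_weight)
```

The multiplier is applied exactly once here, on the final delta, which is the behaviour the "multiplier applied twice in finalize_updown" fix restores; Conv2d weight shapes and alpha scaling are left out of the sketch.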
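"refactor: use forward hook instead of custom forward", "fix: return orig weights during updown, merge weights before forward", and "fix: support multiplier, no forward pass hook" trace a move away from a custom forward toward merging the delta into the layer weights themselves. Below is a minimal sketch of the intermediate hook-based stage, assuming a hypothetical helper `install_oft_hook`; the later commits drop the per-call hook in favour of merging once up front.

```python
import torch
import torch.nn as nn

def install_oft_hook(module: nn.Linear, updown: torch.Tensor):
    """Merge the OFT delta into the weight just before every forward call.
    The returned handle can be .remove()'d to restore the module."""
    orig_weight = module.weight.data.clone()

    def pre_hook(mod: nn.Module, args):
        # swap in the merged weight; updown is the (R - I) @ W delta from above
        mod.weight.data = orig_weight + updown.to(orig_weight)
        return args

    return module.register_forward_pre_hook(pre_hook)
```

Merging ahead of time, as the later "use merge_weight to cache value" and "cached R in forward" commits do, avoids repeating this addition on every call.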
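The COFT-related commits ("ignore calc_scale() for COFT which has very small alpha", "refactor: fix constraint, re-use get_weight", "fix: wrong devices for eye and constraint") concern constrained OFT, where the learned blocks are rescaled so the rotation stays within a small radius of the identity. A hedged sketch of that clamp follows; the bound is typically derived from alpha, and `apply_coft_constraint` is an illustrative name.

```python
import torch

def apply_coft_constraint(q: torch.Tensor, constraint: float) -> torch.Tensor:
    """Rescale the skew-symmetric blocks so their overall norm never exceeds
    `constraint`, keeping R = cayley(q) close to the identity."""
    norm = torch.norm(q)
    clamped = torch.clamp(norm, max=constraint)
    return q * (clamped / (norm + 1e-8))
```

In a module like the sketch above, this would run on q before the Cayley transform; the "wrong devices" fix suggests the real constraint is a tensor that has to live on the same device as the blocks.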
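"refactor: move factorization to lyco_helpers, separate calc_updown for kohya and kb" refers to a helper that splits a layer dimension into a (block count, block size) pair so the LyCORIS diag-oft blocks line up with the weight. A rough sketch of such a helper is shown below; the real lyco_helpers version may choose factors differently.

```python
def factorization(dimension: int, factor: int = -1) -> tuple[int, int]:
    """Split `dimension` into two factors (small, large), preferring the most
    balanced pair; honour an explicit `factor` when it divides evenly."""
    if factor > 0 and dimension % factor == 0:
        other = dimension // factor
        return min(factor, other), max(factor, other)
    small = int(dimension ** 0.5)
    while small > 1 and dimension % small != 0:
        small -= 1
    return small, dimension // small
```

For example, factorization(768) returns (24, 32), i.e. 24 blocks of size 32 (or vice versa, depending on convention).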
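The most recent commit, "feat: support LyCORIS BOFT", adds butterfly OFT, where a single block-diagonal rotation is replaced by a product of several block-diagonal orthogonal factors, each applied in its own permuted (butterfly-wired) basis. A very rough sketch of that composition follows; the permutations, shapes, and function name are assumptions rather than the LyCORIS layout.

```python
import torch

def boft_rotate(weight: torch.Tensor,
                factors: list[torch.Tensor],
                perms: list[torch.Tensor]) -> torch.Tensor:
    """Apply W' = (P_m^T B_m P_m) ... (P_1^T B_1 P_1) W, where each B_i is a
    set of small orthogonal blocks and each P_i is a butterfly permutation."""
    out_dim = weight.shape[0]
    x = weight
    for blocks, perm in zip(factors, perms):
        num_blocks, block_size, _ = blocks.shape   # num_blocks * block_size == out_dim
        x = x[perm]                                 # shuffle rows into this factor's basis
        x = torch.bmm(blocks, x.reshape(num_blocks, block_size, -1))
        x = x.reshape(out_dim, -1)[torch.argsort(perm)]  # undo the shuffle
    return x
```

Because each factor only mixes rows within small blocks, the butterfly permutations are what let information eventually flow between all rows while keeping each individual factor cheap.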