Commit Graph

191 Commits

Author SHA1 Message Date
Anthony Mercurio 511ee9e6d2
Merge pull request #47 from chavinlo/patch-2
Move the model to device BEFORE creating the optimizer
2022-11-20 10:06:50 -05:00
Carlos Chavez f2cfe65d09
Move the model to device BEFORE creating the optimizer
>It shouldn’t matter, as the optimizer should hold the references to the parameters (even after moving them). However, the “safer” approach would be to move the model to the device first and create the optimizer afterwards.

https://discuss.pytorch.org/t/should-i-create-optimizer-after-sending-the-model-to-gpu/133418/2
https://discuss.pytorch.org/t/effect-of-calling-model-cuda-after-constructing-an-optimizer/15165

At least in my experience with hivemind, if you initialize the optimizer and move the model afterwards, it throws errors about some data being on the CPU and other data on the GPU. This shouldn't affect performance, I believe (see the sketch after this entry).
2022-11-20 00:09:35 -05:00
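
A minimal sketch of the ordering the commit above describes (generic PyTorch, not the project's actual trainer code): move the model to the device first, then construct the optimizer from the already-moved parameters.

```python
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(768, 768)  # stand-in for the actual diffusion model
model = model.to(device)           # 1) move the parameters to the device first

# 2) only now create the optimizer, so it holds references to the on-device
#    parameters (avoids mixed CPU/GPU tensor errors, e.g. with hivemind)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)
```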
Anthony Mercurio 1d1f4022d2
Merge pull request #45 from lopho/patch-2
fix wandb init mode, don't log hf token
2022-11-16 16:32:50 -05:00
lopho 9916294de1
fix wandb init mode, don't log hf token
correct value for mode ('enabled' is invalid)
clear hf_token passed to wandb to avoid logging it
2022-11-16 22:28:16 +01:00
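
A minimal sketch of the two fixes in the commit above (argument and project names are illustrative, not the repository's exact flags): `wandb.init(mode=...)` accepts "online", "offline" or "disabled" ("enabled" is not a valid value), and the Hugging Face token is blanked out before the run config is logged.

```python
import wandb

def init_wandb(args):
    config = vars(args).copy()
    config["hf_token"] = ""  # never send the Hugging Face token to wandb

    return wandb.init(
        project="diffusion-trainer",                        # illustrative name
        config=config,
        mode="online" if args.enablewandb else "disabled",  # 'enabled' is invalid
    )
```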
Anthony Mercurio c8eeaaf353
Merge pull request #42 from chavinlo/inference-option
Add options and local inference
2022-11-16 16:21:15 -05:00
Anthony Mercurio dc5849b235
Merge branch 'main' into inference-option 2022-11-16 16:20:57 -05:00
chavinlo a2772fc668 fixes 2022-11-16 10:55:38 -05:00
chavinlo fed3431f03 Revert "sync trainer with main branch"
This reverts commit 80e2422967.
2022-11-16 10:44:39 -05:00
Carlos Chavez 80e2422967
sync trainer with main branch 2022-11-16 10:39:20 -05:00
Anthony Mercurio da4f4b93ab
Merge pull request #43 from Maw-Fox/staging-docfix
Minor fix: Documentation Consistency
2022-11-15 13:30:27 -05:00
Maw-Fox 015eeae274
Documentation consistency. 2022-11-15 10:34:55 -07:00
Anthony Mercurio 29ffbd645e
Fix noise scheduler 2022-11-15 11:08:38 -05:00
Anthony Mercurio 5be5a487b2
Merge pull request #39 from Maw-Fox/staging-migration
Implementation of validation/resize classes
2022-11-15 10:33:47 -05:00
Maw-Fox 6c5b2e7149
Fix of fix 2022-11-15 07:15:18 -07:00
Maw-Fox 2c18d29613
Fix from upstream merge. 2022-11-15 06:42:14 -07:00
Maw-Fox b3b5523d85
Merge branch 'staging-migration' of https://github.com/maw-fox/waifu-diffusion into staging-migration 2022-11-14 20:15:15 -07:00
Carlos Chavez d600078008
Add options and local inference
Added options to:
- Disable Inference (it consumes about 2 GB of VRAM even when not active)
- Disable wandb

and:
- if no hf token is provided, it is simply filled with an empty value so the script doesn't complain
- if wandb is not enabled, the inference outputs are saved to a local folder along with information about them (see the sketch after this entry)
2022-11-14 22:08:16 -05:00
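
A minimal sketch of the local fallback described in the commit above (function and path names are illustrative): when wandb is disabled, the sampled images are written to a local folder together with the prompt and step metadata.

```python
import json
import os

def save_inference_locally(images, prompt, global_step, outdir="inference_outputs"):
    os.makedirs(outdir, exist_ok=True)
    for i, image in enumerate(images):  # PIL images returned by the pipeline
        image.save(os.path.join(outdir, f"step{global_step}_{i}.png"))
    # store information about the samples next to them
    with open(os.path.join(outdir, f"step{global_step}.json"), "w") as f:
        json.dump({"prompt": prompt, "step": global_step}, f)
```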
Maw-Fox 773e65f324
Merge origin:main into remote:staging-migration 2022-11-14 19:59:45 -07:00
Anthony Mercurio 5c205524e5
Merge branch 'main' into staging-migration 2022-11-14 12:31:05 -05:00
Anthony Mercurio ae561d19f7
Merge pull request #40 from lopho/patch-1
Parse booleans in argument parser
2022-11-14 12:29:31 -05:00
Maw-Fox 978dd45072
Fix. 2022-11-13 08:24:40 -07:00
Maw-Fox 4943d978c1
Fix redundancies. 2022-11-13 08:22:44 -07:00
Maw-Fox 95b9407a3e
Add+config .gitignore (bring back git stage) and fix up documentation. 2022-11-12 18:48:16 -07:00
Maw-Fox 6bd6c6a4ef
Fixed/flipped help text. 2022-11-12 16:11:30 -07:00
Maw-Fox 189f621a1e
Here, let's fix this while we're at it. 2022-11-12 15:47:17 -07:00
lopho cd0910e82d
Parse booleans in argument parser
true, yes or 1 correspond to True; anything else is False.
2022-11-12 11:14:48 +01:00
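
A minimal sketch of the boolean parsing the commit above describes (the helper name is illustrative; the project's implementation may differ): an argparse `type` callback that maps "true", "yes" or "1" to True and anything else to False.

```python
import argparse

def bool_t(value: str) -> bool:
    # "true", "yes" or "1" correspond to True, anything else to False
    return value.lower() in ("true", "yes", "1")

parser = argparse.ArgumentParser()
parser.add_argument("--use_ema", type=bool_t, default=False,
                    help="Use EMA weights ('true'/'yes'/'1' to enable).")

print(parser.parse_args(["--use_ema", "yes"]).use_ema)  # True
print(parser.parse_args(["--use_ema", "no"]).use_ema)   # False
```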
Maw-Fox d1eb3ace3f I lied. 2022-11-11 18:17:29 -07:00
Maw-Fox 6c2d5d8066 Final cleanup. 2022-11-11 18:09:09 -07:00
Maw-Fox de221ea42e Derp. ImageStore.__init__ already iterates fully :) 2022-11-11 17:59:32 -07:00
Maw-Fox 925eacf374 Cleanup 2022-11-11 17:50:23 -07:00
Maw-Fox c12cbfced3 Fixed ref typo. 2022-11-11 17:43:09 -07:00
Maw-Fox 6480336d2c Cleanup test code. 2022-11-11 17:17:50 -07:00
Maw-Fox 120d406355 Implementation of validation/resize classes. 2022-11-11 17:14:46 -07:00
Anthony Mercurio 624f0f14af
correctly set args 2022-11-11 08:25:27 -07:00
Anthony Mercurio 033c1e75d0
Merge pull request #36 from harubaru/relicense
Code overhaul & relicense to AGPL 3.0
2022-11-11 07:45:08 -07:00
harubaru 15e70690de Add aesthetic 2022-11-10 13:01:11 -07:00
harubaru 1f5b671b67 relicense 2022-11-10 12:59:53 -07:00
Anthony Mercurio bc626e80e1
Merge pull request #34 from chavinlo/patch-1
typo
2022-11-10 09:23:10 -07:00
Anthony Mercurio 548ebea881
Merge pull request #32 from Maw-Fox/main
Add resize argument, various fixes
2022-11-10 09:22:48 -07:00
Anthony Mercurio 4e89309d6b
Merge branch 'main' into main 2022-11-10 09:22:37 -07:00
Anthony Mercurio 2f8d71c589
Merge pull request #33 from laksjdjf/sub
Apply xformers in diffusers_trainer.py
2022-11-10 09:21:30 -07:00
Carlos Chavez c4b1860282
typo
lol
2022-11-10 10:13:14 -05:00
Maw-Fox 46e8d98d2b Revert 2022-11-10 06:11:09 -07:00
Maw-Fox 0be39a4887 Fix deprecated enum 2022-11-10 05:42:38 -07:00
laksjdjf 353200b039
Update diffusers_trainer.py 2022-11-09 23:56:49 +00:00
Maw-Fox 3fe9df1450 Cleanup. 2022-11-08 20:50:57 -07:00
laksjdjf 0c29d1e84d
Update diffusers_trainer.py 2022-11-09 10:34:49 +09:00
Maw-Fox fe30e8942e Add resize and optional resize arg 2022-11-08 18:17:31 -07:00
Anthony Mercurio 1ea31eb71e
Merge pull request #31 from john-sungjin/fix-ema-checkpoint-saving
Restore non-EMA weights after saving checkpoint
2022-11-06 16:44:11 -08:00
John Kim 94603d9578 Added store/restore to EMAModel, restore non-EMA weights after saving checkpoint 2022-11-07 00:28:55 +00:00
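
A minimal sketch of the checkpointing pattern PR #31 describes (method names follow the diffusers-style `EMAModel` API; the project's patched class may differ): stash the live weights, copy the EMA weights in for the checkpoint, then restore the non-EMA weights so training continues where it left off.

```python
def save_checkpoint_with_ema(pipeline, unet, ema_model, checkpoint_dir, use_ema=True):
    if use_ema:
        ema_model.store(unet.parameters())    # remember the current training weights
        ema_model.copy_to(unet.parameters())  # swap in the averaged (EMA) weights

    pipeline.save_pretrained(checkpoint_dir)  # checkpoint is written with EMA weights

    if use_ema:
        ema_model.restore(unet.parameters())  # put the non-EMA weights back for training
```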