flamelaw
89d8ecff09
small fixes
2022-11-23 02:49:01 +09:00
flamelaw
5b57f61ba4
fix pin_memory with different latent sampling method
2022-11-21 10:15:46 +09:00
flamelaw
bd68e35de3
Gradient accumulation, autocast fix, new latent sampling method, etc
2022-11-20 12:35:26 +09:00
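The commit above names gradient accumulation among its changes. As a point of reference, the general technique looks like the following textbook sketch (toy loss, made-up numbers — not the repo's implementation): gradients from several micro-batches are summed, scaled by the accumulation count, and applied in a single optimizer step.

```python
# Generic gradient-accumulation sketch with a toy quadratic loss; all names
# and values here are illustrative, not taken from the repository.
accum_steps = 4
grad = 0.0
weights = 1.0
lr = 0.1

def loss_grad(w, x):
    # gradient of the toy loss (w - x)^2 / 2 with respect to w
    return w - x

for step, x in enumerate([0.5, 0.2, 0.8, 0.1], start=1):
    grad += loss_grad(weights, x) / accum_steps   # accumulate scaled gradients
    if step % accum_steps == 0:
        weights -= lr * grad                      # one optimizer step per window
        grad = 0.0
```

The effect is a single update equivalent to training on the larger combined batch, at the memory cost of one micro-batch.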
AUTOMATIC
cdc8020d13
change StableDiffusionProcessing to internally use sampler name instead of sampler index
2022-11-19 12:01:51 +03:00
AUTOMATIC
62e3d71aa7
rework the code to not use the walrus operator because colab's 3.7 does not support it
2022-11-05 17:09:42 +03:00
AUTOMATIC1111
cb84a304f0
Merge pull request #4273 from Omegastick/ordered_hypernetworks
Sort hypernetworks list
2022-11-05 16:16:18 +03:00
Isaac Poulton
08feb4c364
Sort straight out of the glob
2022-11-04 20:53:11 +07:00
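The two sorting commits here describe ordering the hypernetwork list directly from the filesystem glob. A minimal sketch of that idea — the directory layout, extension, and function name below are assumptions, not the repo's actual code:

```python
import glob
import os

def list_hypernetworks(path):
    # Sort straight out of the glob so the resulting name -> path mapping
    # (and any UI dropdown built from it) is ordered by filename.
    return {
        os.path.splitext(os.path.basename(p))[0]: p
        for p in sorted(glob.iglob(os.path.join(path, "**/*.pt"), recursive=True))
    }
```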
Isaac Poulton
fd62727893
Sort hypernetworks
2022-11-04 18:34:35 +07:00
aria1th
1ca0bcd3a7
only save if option is enabled
2022-11-04 16:09:19 +09:00
aria1th
f5d394214d
split before declaring file name
2022-11-04 16:04:03 +09:00
aria1th
283249d239
apply
2022-11-04 15:57:17 +09:00
AngelBottomless
179702adc4
Merge branch 'AUTOMATIC1111:master' into force-push-patch-13
2022-11-04 15:51:09 +09:00
AngelBottomless
0d07cbfa15
I blame code autocomplete
2022-11-04 15:50:54 +09:00
aria1th
0abb39f461
resolve conflict - first revert
2022-11-04 15:47:19 +09:00
AUTOMATIC1111
4918eb6ce4
Merge branch 'master' into hn-activation
2022-11-04 09:02:15 +03:00
aria1th
1764ac3c8b
use hash to check valid optim
2022-11-03 14:49:26 +09:00
aria1th
0b143c1163
Separate .optim file from model
2022-11-03 14:30:53 +09:00
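The two commits above describe keeping optimizer state in a separate .optim file and using a hash to check that it still matches the model. A hedged sketch of that pattern — the file layout, JSON encoding, and key names are assumptions for illustration, not the repo's actual format:

```python
import hashlib
import json
import os

def weights_hash(raw):
    # hash of the serialized model weights, used to validate the .optim file
    return hashlib.sha256(raw).hexdigest()

def save_with_optim(model_path, raw_weights, optim_state):
    with open(model_path, "wb") as f:
        f.write(raw_weights)
    # store the weights hash alongside the optimizer state in a sidecar file
    optim_state = dict(optim_state, model_hash=weights_hash(raw_weights))
    with open(model_path + ".optim", "w") as f:
        json.dump(optim_state, f)

def load_optim(model_path):
    with open(model_path, "rb") as f:
        raw = f.read()
    if not os.path.exists(model_path + ".optim"):
        return None
    with open(model_path + ".optim") as f:
        state = json.load(f)
    # reuse the optimizer state only if it matches these exact weights
    return state if state.get("model_hash") == weights_hash(raw) else None
```

If the model file is edited or replaced, the stored hash no longer matches and the stale optimizer state is ignored.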
aria1th
9d96d7d0a0
resolve conflicts
2022-10-30 20:40:59 +09:00
AngelBottomless
20194fd975
We have duplicate linear now
2022-10-30 20:40:59 +09:00
AUTOMATIC1111
17a2076f72
Merge pull request #3928 from R-N/validate-before-load
Optimize training a little
2022-10-30 09:51:36 +03:00
Muhammad Rizqi Nur
3d58510f21
Fix dataset still being loaded even when training will be skipped
2022-10-30 00:54:59 +07:00
Muhammad Rizqi Nur
a07f054c86
Add missing info on hypernetwork/embedding model log
Mentioned here: https://github.com/AUTOMATIC1111/stable-diffusion-webui/discussions/1528#discussioncomment-3991513
Also group the saving into one
2022-10-30 00:49:29 +07:00
Muhammad Rizqi Nur
ab05a74ead
Revert "Add cleanup after training"
This reverts commit 3ce2bfdf95.
2022-10-30 00:32:02 +07:00
Muhammad Rizqi Nur
3ce2bfdf95
Add cleanup after training
2022-10-29 19:43:21 +07:00
Muhammad Rizqi Nur
ab27c111d0
Add input validations before loading dataset for training
2022-10-29 18:09:17 +07:00
timntorres
e98f72be33
Merge branch 'AUTOMATIC1111:master' into 3825-save-hypernet-strength-to-info
2022-10-29 00:31:23 -07:00
AUTOMATIC1111
810e6a407d
Merge pull request #3858 from R-N/log-csv
Fix log off by 1 #3847
2022-10-29 07:55:20 +03:00
AUTOMATIC1111
d3b4b9d7ec
Merge pull request #3717 from benkyoujouzu/master
Add missing support for linear activation in hypernetwork
2022-10-29 07:30:14 +03:00
AngelBottomless
f361e804eb
Re enable linear
2022-10-29 08:36:50 +09:00
Muhammad Rizqi Nur
9ceef81f77
Fix log off by 1
2022-10-28 20:48:08 +07:00
timntorres
db5a354c48
Always ignore "None.pt" in the hypernet directory.
2022-10-28 01:41:57 -07:00
benkyoujouzu
b2a8b263b2
Add missing support for linear activation in hypernetwork
2022-10-28 12:54:59 +08:00
AngelBottomless
462e6ba667
Disable unavailable or duplicate options
2022-10-27 15:40:24 +09:00
AngelBottomless
029d7c7543
Revert unresolved changes in Bias initialization
it should be zeros_ or parameterized in future properly.
2022-10-27 14:44:53 +09:00
guaneec
cc56df996e
Fix dropout logic
2022-10-27 14:38:21 +09:00
AngelBottomless
85fcccc105
Squashed commit of fixing dropout silently
fix dropouts for future hypernetworks
add kwargs for Hypernetwork class
hypernet UI for gradio input
add recommended options
remove as options
revert adding options in ui
2022-10-27 14:38:21 +09:00
guaneec
b6a8bb123b
Fix merge
2022-10-26 15:15:19 +08:00
timntorres
a524d137d0
patch bug (SeverianVoid's comment on 5245c7a)
2022-10-26 10:12:46 +03:00
guaneec
91bb35b1e6
Merge fix
2022-10-26 15:00:03 +08:00
guaneec
649d79a8ec
Merge branch 'master' into hn-activation
2022-10-26 14:58:04 +08:00
guaneec
877d94f97c
Back compatibility
2022-10-26 14:50:58 +08:00
AngelBottomless
7207e3bf49
remove duplicate keys and lowercase
2022-10-26 09:17:01 +03:00
AngelBottomless
de096d0ce7
Weight initialization and More activation func
add weight init
add weight init option in create_hypernetwork
fstringify hypernet info
save weight initialization info for further debugging
fill bias with zero for He/Xavier
initialize LayerNorm with Normal
fix loading weight_init
2022-10-26 09:17:01 +03:00
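The commit body above lists the initialization schemes added: He/Xavier for weights with bias filled with zero, and Normal for LayerNorm. A minimal NumPy sketch of the standard formulas those names refer to — the function name and signature are illustrative, not the repo's actual code:

```python
import numpy as np

def init_weights(fan_in, fan_out, scheme="KaimingNormal", seed=0):
    # Standard He/Xavier normal initialization; scheme names mirror the
    # commit message, everything else is an assumption for illustration.
    rng = np.random.default_rng(seed)
    if scheme == "KaimingNormal":
        std = (2.0 / fan_in) ** 0.5              # He init, suited to ReLU-family
    elif scheme == "XavierNormal":
        std = (2.0 / (fan_in + fan_out)) ** 0.5  # Xavier/Glorot init
    else:
        raise ValueError(f"unknown weight_init: {scheme}")
    weight = rng.normal(0.0, std, size=(fan_out, fan_in))
    bias = np.zeros(fan_out)                     # "fill bias with zero for He/Xavier"
    return weight, bias
```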
guaneec
c702d4d0df
Fix off-by-one
2022-10-26 13:43:04 +08:00
guaneec
2f4c91894d
Remove activation from final layer of HNs
2022-10-26 12:10:30 +08:00
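Several entries above converge on how a hypernetwork's layer stack should be assembled: activation and dropout go between layers, but neither is applied after the final linear layer. A schematic builder showing that placement rule — the structure and names are assumptions, not the repo's actual Hypernetwork code:

```python
def build_layers(sizes, activation="relu", dropout_p=0.3):
    # Build a symbolic layer list: linear layers joined by activation and
    # dropout, with neither applied after the final linear layer.
    layers = []
    for i in range(len(sizes) - 1):
        layers.append(("linear", sizes[i], sizes[i + 1]))
        is_last = i == len(sizes) - 2
        if not is_last:
            layers.append(("act", activation))   # no activation on final layer
            if dropout_p:
                layers.append(("dropout", dropout_p))  # dropout only between hidden layers
    return layers
```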
AngelBottomless
e9a410b535
check length for variance
2022-10-24 09:07:39 +03:00
AngelBottomless
0d2e1dac40
convert deque -> list
I don't feel this being efficient
2022-10-24 09:07:39 +03:00
AngelBottomless
348f89c8d4
statistics for pbar
2022-10-24 09:07:39 +03:00
AngelBottomless
40b56c9289
cleanup some code
2022-10-24 09:07:39 +03:00
AngelBottomless
b297cc3324
Hypernetworks - fix KeyError in statistics caching
Statistics logging has changed to {filename : list[losses]}, so it has to use loss_info[key].pop()
2022-10-24 09:07:39 +03:00
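The commit body above states that statistics logging changed to a {filename : list[losses]} structure, so trimming must index by key before calling pop(). A minimal sketch of that per-file caching shape — the helper name, trim length, and surrounding logic are illustrative, not the repo's actual code:

```python
from collections import defaultdict

# Loss statistics keyed by filename, each mapping to a list of recent losses,
# matching the {filename: list[losses]} structure the commit message describes.
loss_info = defaultdict(list)

def record_loss(filename, loss, keep=32):
    loss_info[filename].append(loss)
    if len(loss_info[filename]) > keep:
        # trim per-file: index by key first, then pop the oldest entry,
        # i.e. loss_info[key].pop() as the fix requires
        loss_info[filename].pop(0)

for v in (0.12, 0.09, 0.15):
    record_loss("img_001.png", v)
mean_loss = sum(loss_info["img_001.png"]) / len(loss_info["img_001.png"])
```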