benkyoujouzu
b2a8b263b2
Add missing support for linear activation in hypernetwork
2022-10-28 12:54:59 +08:00
Muhammad Rizqi Nur
2a25729623
Gradient clipping in train tab
2022-10-28 09:44:56 +07:00
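The commit above adds a gradient-clipping option to the train tab. As a rough plain-Python sketch of clipping by global norm (the actual code would operate on tensors, e.g. via torch.nn.utils.clip_grad_norm_; the function below is illustrative only):

```python
import math

def clip_grad_norm(grads, max_norm):
    """Scale a flat list of gradients so their global L2 norm
    does not exceed max_norm. Hypothetical sketch, not the repo's code."""
    total = math.sqrt(sum(g * g for g in grads))
    if total > max_norm:
        scale = max_norm / total
        return [g * scale for g in grads]
    return list(grads)
```

For example, clip_grad_norm([3.0, 4.0], 1.0) rescales the norm-5 gradient down to norm 1, while gradients already within the limit pass through unchanged.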
AngelBottomless
462e6ba667
Disable unavailable or duplicate options
2022-10-27 15:40:24 +09:00
AngelBottomless
029d7c7543
Revert unresolved changes in Bias initialization
it should be zeros_ or properly parameterized in the future.
2022-10-27 14:44:53 +09:00
guaneec
cc56df996e
Fix dropout logic
2022-10-27 14:38:21 +09:00
AngelBottomless
85fcccc105
Squashed commit of fixing dropout silently
fix dropouts for future hypernetworks
add kwargs for Hypernetwork class
hypernet UI for gradio input
add recommended options
remove as options
revert adding options in ui
2022-10-27 14:38:21 +09:00
guaneec
b6a8bb123b
Fix merge
2022-10-26 15:15:19 +08:00
timntorres
a524d137d0
patch bug (SeverianVoid's comment on 5245c7a)
2022-10-26 10:12:46 +03:00
guaneec
91bb35b1e6
Merge fix
2022-10-26 15:00:03 +08:00
guaneec
649d79a8ec
Merge branch 'master' into hn-activation
2022-10-26 14:58:04 +08:00
guaneec
877d94f97c
Back compatibility
2022-10-26 14:50:58 +08:00
AngelBottomless
7207e3bf49
remove duplicate keys and lowercase
2022-10-26 09:17:01 +03:00
AngelBottomless
de096d0ce7
Weight initialization and More activation func
add weight init
add weight init option in create_hypernetwork
fstringify hypernet info
save weight initialization info for further debugging
fill bias with zero for He/Xavier
initialize LayerNorm with Normal
fix loading weight_init
2022-10-26 09:17:01 +03:00
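The body above describes He/Xavier weight initialization with zero-filled biases. A minimal dependency-free sketch of those schemes (in the real code this would use torch.nn.init; names here are hypothetical):

```python
import math
import random

def init_std(fan_in, fan_out, weight_init="Xavier"):
    # He/Kaiming normal: sqrt(2 / fan_in); Xavier/Glorot normal: sqrt(2 / (fan_in + fan_out))
    if weight_init == "He":
        return math.sqrt(2.0 / fan_in)
    return math.sqrt(2.0 / (fan_in + fan_out))

def init_linear(fan_in, fan_out, weight_init="Xavier"):
    std = init_std(fan_in, fan_out, weight_init)
    weight = [[random.gauss(0.0, std) for _ in range(fan_in)]
              for _ in range(fan_out)]
    bias = [0.0] * fan_out  # bias filled with zero for He/Xavier, per the body above
    return weight, bias
```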
guaneec
c702d4d0df
Fix off-by-one
2022-10-26 13:43:04 +08:00
guaneec
2f4c91894d
Remove activation from final layer of HNs
2022-10-26 12:10:30 +08:00
AngelBottomless
e9a410b535
check length for variance
2022-10-24 09:07:39 +03:00
AngelBottomless
0d2e1dac40
convert deque -> list
I don't think this is efficient
2022-10-24 09:07:39 +03:00
AngelBottomless
348f89c8d4
statistics for pbar
2022-10-24 09:07:39 +03:00
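The "statistics for pbar" and "check length for variance" commits suggest a recent-loss buffer whose mean and standard deviation feed the progress bar, with a length guard because variance needs at least two samples. A hypothetical sketch (not the repository's code):

```python
from collections import deque
from statistics import mean, stdev

# Keep only the most recent losses; report mean±stdev when possible.
recent_losses = deque(maxlen=32)

def log_loss(loss):
    recent_losses.append(loss)

def pbar_description():
    if len(recent_losses) >= 2:
        # stdev requires at least two data points, hence the length check
        return f"loss: {mean(recent_losses):.3f}\u00b1{stdev(recent_losses):.3f}"
    if recent_losses:
        return f"loss: {recent_losses[-1]:.3f}"
    return "loss: n/a"
```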
AngelBottomless
40b56c9289
cleanup some code
2022-10-24 09:07:39 +03:00
AngelBottomless
b297cc3324
Hypernetworks - fix KeyError in statistics caching
Statistics logging has changed to {filename : list[losses]}, so it has to use loss_info[key].pop()
2022-10-24 09:07:39 +03:00
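The KeyError fix above hinges on the cache layout changing to {filename: list[losses]}. A small illustrative sketch of that structure (names hypothetical, not the actual hypernetwork.py code):

```python
from collections import defaultdict

# Each dataset filename maps to the list of losses recorded for it.
loss_info = defaultdict(list)

def record_loss(filename, loss):
    loss_info[filename].append(loss)

record_loss("img_0001.png", 0.12)
record_loss("img_0001.png", 0.09)

# With a dict-of-lists, the most recent entry for a key comes from
# loss_info[key].pop() -- popping on the dict itself would be wrong.
latest = loss_info["img_0001.png"].pop()
```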
DepFA
1fbfc052eb
Update hypernetwork.py
2022-10-23 08:34:33 +03:00
AngelBottomless
48dbf99e84
Allow tracking real-time loss
Someone had 6000 images in their dataset, and it was shown as 0, which was confusing.
This will allow tracking real time dataset-average loss for registered objects.
2022-10-22 22:24:19 +03:00
AngelBottomless
24694e5983
Update hypernetwork.py
2022-10-22 20:25:32 +03:00
discus0434
6a4fa73a38
small fix
2022-10-22 13:44:39 +00:00
discus0434
97749b7c7d
Merge branch 'AUTOMATIC1111:master' into master
2022-10-22 22:00:59 +09:00
discus0434
7912acef72
small fix
2022-10-22 13:00:44 +00:00
discus0434
fccba4729d
add an option to avoid dying relu
2022-10-22 12:02:41 +00:00
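A "dying ReLU" is a unit whose pre-activation is negative for every input, so ReLU outputs zero and backpropagates zero gradient, leaving the unit stuck. One common mitigation, shown here as a plain-Python sketch rather than the repository's actual option, is a leaky variant with a small negative slope:

```python
def relu(x):
    return max(0.0, x)

def leaky_relu(x, negative_slope=0.01):
    # The negative side keeps a small slope, so the gradient is never
    # exactly zero and a "dead" unit can still recover during training.
    return x if x >= 0.0 else negative_slope * x
```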
AUTOMATIC
7fd90128eb
added a guard for hypernet training that will stop early if weights are getting no gradients
2022-10-22 14:48:43 +03:00
discus0434
dcb45dfecf
Merge branch 'master' of upstream
2022-10-22 11:14:46 +00:00
discus0434
0e8ca8e7af
add dropout
2022-10-22 11:07:00 +00:00
timntorres
272fa527bb
Remove unused variable.
2022-10-21 16:52:24 +03:00
timntorres
19818f023c
Match hypernet name with filename in all cases.
2022-10-21 16:52:24 +03:00
timntorres
51e3dc9cca
Sanitize hypernet name input.
2022-10-21 16:52:24 +03:00
AUTOMATIC
03a1e288c4
turns out LayerNorm also has weight and bias and needs to be pre-multiplied and trained for hypernets
2022-10-21 10:13:24 +03:00
AUTOMATIC1111
0c5522ea21
Merge branch 'master' into training-help-text
2022-10-21 09:57:55 +03:00
timntorres
4ff274e1e3
Revise comments.
2022-10-21 09:55:00 +03:00
timntorres
5245c7a493
Issue #2921-Give PNG info to Hypernet previews.
2022-10-21 09:55:00 +03:00
AUTOMATIC
c23f666dba
a more strict check for activation type and a more reasonable check for type of layer in hypernets
2022-10-21 09:47:43 +03:00
aria1th
f89829ec3a
Revert "fix bugs and optimizations"
This reverts commit 108be15500.
2022-10-21 01:37:11 +09:00
AngelBottomless
108be15500
fix bugs and optimizations
2022-10-21 01:00:41 +09:00
AngelBottomless
a71e021236
only linear
2022-10-20 23:48:52 +09:00
AngelBottomless
d8acd34f66
generalized some functions and option for ignoring first layer
2022-10-20 23:43:03 +09:00
discus0434
6b38c2c19c
Merge branch 'AUTOMATIC1111:master' into master
2022-10-20 18:51:12 +09:00
AUTOMATIC
930b4c64f7
allow float sizes for hypernet's layer_structure
2022-10-20 08:18:02 +03:00
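A hypernetwork's layer_structure is a list of width multipliers applied to a base dimension, so float entries allow hidden widths that integer multipliers cannot express. A rough sketch (function name hypothetical):

```python
def layer_sizes(dim, layer_structure):
    # Each entry multiplies the base dimension; floats permit e.g. 1.5x layers.
    return [int(dim * m) for m in layer_structure]
```

For example, layer_sizes(320, [1, 2, 1]) gives [320, 640, 320], while a float entry such as [1, 1.5, 1] gives [320, 480, 320].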
discus0434
6f98e89486
update
2022-10-20 00:10:45 +00:00
DepFA
166be3919b
allow overwrite old hn
2022-10-20 00:09:40 +01:00
DepFA
d6ea584137
change html output
2022-10-20 00:07:57 +01:00
discus0434
2ce52d32e4
fix for #3086 failing to load any previous hypernet
2022-10-19 16:31:12 +00:00
AUTOMATIC
c6e9fed500
fix for #3086 failing to load any previous hypernet
2022-10-19 19:21:16 +03:00
discus0434
3770b8d2fa
allow users to write the hypernetwork layer structure themselves
2022-10-19 15:28:42 +00:00
discus0434
42fbda83bb
move layer options into the create-hypernetwork UI
2022-10-19 14:30:33 +00:00
discus0434
7f8670c4ef
Merge branch 'master' into master
2022-10-19 15:18:45 +09:00
Silent
da72becb13
Use training width/height when training hypernetworks.
2022-10-19 09:13:28 +03:00
discus0434
e40ba281f1
update
2022-10-19 01:03:58 +09:00
discus0434
a5611ea502
update
2022-10-19 01:00:01 +09:00
discus0434
6021f7a75f
add options to custom hypernetwork layer structure
2022-10-19 00:51:36 +09:00
AngelBottomless
703e6d9e4e
check NaN for hypernetwork tuning
2022-10-15 17:15:26 +03:00
AUTOMATIC
c7a86f7fe9
add option to use batch size for training
2022-10-15 09:24:59 +03:00
AUTOMATIC
03d62538ae
remove duplicate code for log loss, add step, make it read from options rather than gradio input
2022-10-14 22:43:55 +03:00
AUTOMATIC
326fe7d44b
Merge remote-tracking branch 'Melanpan/master'
2022-10-14 22:14:50 +03:00
AUTOMATIC
c344ba3b32
add option to read generation params for learning previews from txt2img
2022-10-14 20:31:49 +03:00
AUTOMATIC
354ef0da3b
add hypernetwork multipliers
2022-10-13 20:12:37 +03:00
Melan
8636b50aea
Add learn_rate to csv and removed a left-over debug statement
2022-10-13 12:37:58 +02:00
Melan
1cfc2a1898
Save a csv containing the loss while training
2022-10-12 23:36:29 +02:00
AUTOMATIC
c3c8eef9fd
train: change filename processing to be more simple and configurable
train: make it possible to make text files with prompts
train: rework scheduler so that there's less repeating code in textual inversion and hypernets
train: move epochs setting to options
2022-10-12 20:49:47 +03:00
AUTOMATIC
ee015a1af6
change textual inversion tab to train
remake train interface to use tabs
2022-10-12 11:05:57 +03:00
Milly
2d006ce16c
xy_grid: Find hypernetwork by closest name
2022-10-12 10:40:10 +03:00
AUTOMATIC
6be32b31d1
per user reports, training with medvram is possible.
2022-10-11 23:07:09 +03:00
AUTOMATIC
d6fcc6b87b
apply lr schedule to hypernets
2022-10-11 22:03:05 +03:00
AUTOMATIC
6a9ea5b41c
prevent extra modules from being saved/loaded with hypernet
2022-10-11 19:22:30 +03:00
AUTOMATIC
d4ea5f4d86
add an option to unload models during hypernetwork training to save VRAM
2022-10-11 19:03:08 +03:00
AUTOMATIC
6d09b8d1df
produce error when training with medvram/lowvram enabled
2022-10-11 18:33:57 +03:00
AUTOMATIC
d682444ecc
add option to select hypernetwork modules when creating
2022-10-11 18:04:47 +03:00
AUTOMATIC
b0583be088
more renames
2022-10-11 15:54:34 +03:00
AUTOMATIC
873efeed49
rename hypernetwork dir to hypernetworks to prevent clash with an old filename that people who use zip instead of git clone will have
2022-10-11 15:51:30 +03:00