Author | Commit | Message | Date
dan | 4688bfff55 | Add auto-sized cropping UI | 2023-01-17 17:16:43 +08:00
Vladimir Mandic | 110d1a2d59 | add fields to settings file | 2023-01-15 12:41:00 -05:00
AUTOMATIC | d8b90ac121 | big rework of progressbar/preview system to allow multiple users to prompts at the same time and do not get previews of each other | 2023-01-15 18:51:04 +03:00
AUTOMATIC | a95f135308 | change hash to sha256 | 2023-01-14 09:56:59 +03:00
AUTOMATIC | 82725f0ac4 | fix a bug caused by merge | 2023-01-13 15:04:37 +03:00
AUTOMATIC1111 | 9cd7716753 | Merge branch 'master' into tensorboard | 2023-01-13 14:57:38 +03:00
AUTOMATIC1111 | 544e7a233e | Merge pull request #6689 from Poktay/add_gradient_settings_to_logging_file: add gradient settings to training settings log files | 2023-01-13 14:45:32 +03:00
AUTOMATIC | a176d89487 | print bucket sizes for training without resizing images #6620; fix an error when generating a picture with embedding in it | 2023-01-13 14:32:15 +03:00
AUTOMATIC1111 | 486bda9b33 | Merge pull request #6620 from guaneec/varsize_batch: Enable batch_size>1 for mixed-sized training | 2023-01-13 14:03:31 +03:00
Josh R | 0b262802b8 | add gradient settings to training settings log files | 2023-01-12 17:31:05 -08:00
Shondoit | d52a80f7f7 | Allow creation of zero vectors for TI | 2023-01-12 09:22:29 +01:00
Vladimir Mandic | 3f43d8a966 | set descriptions | 2023-01-11 10:28:55 -05:00
Lee Bousfield | f9706acf43 | Support loading textual inversion embeddings from safetensors files | 2023-01-10 18:40:34 -07:00
dan | 6be644fa04 | Enable batch_size>1 for mixed-sized training | 2023-01-11 05:31:58 +08:00
AUTOMATIC | 1fbb6f9ebe | make a dropdown for prompt template selection | 2023-01-09 23:35:40 +03:00
AUTOMATIC | 43bb5190fc | remove/simplify some changes from #6481 | 2023-01-09 22:52:23 +03:00
AUTOMATIC1111 | 18c001792a | Merge branch 'master' into varsize | 2023-01-09 22:45:39 +03:00
AUTOMATIC | 085427de0e | make it possible for extensions/scripts to add their own embedding directories | 2023-01-08 09:37:33 +03:00
AUTOMATIC | a0c87f1fdf | skip images in embeddings dir if they have a second .preview extension | 2023-01-08 08:52:26 +03:00
dan | 72497895b9 | Move batchsize check | 2023-01-08 02:57:36 +08:00
dan | 669fb18d52 | Add checkbox for variable training dims | 2023-01-08 02:31:40 +08:00
dan | 448b9cedab | Allow variable img size | 2023-01-08 02:14:36 +08:00
AUTOMATIC | 79e39fae61 | CLIP hijack rework | 2023-01-07 01:46:13 +03:00
AUTOMATIC | 683287d87f | rework saving training params to file #6372 | 2023-01-06 08:52:06 +03:00
AUTOMATIC1111 | 88e01b237e | Merge pull request #6372 from timntorres/save-ti-hypernet-settings-to-txt-revised: Save hypernet and textual inversion settings to text file, revised. | 2023-01-06 07:59:44 +03:00
Faber | 81133d4168 | allow loading embeddings from subdirectories | 2023-01-06 03:38:37 +07:00
Kuma | fda04e620d | typo in TI | 2023-01-05 18:44:19 +01:00
timntorres | b6bab2f052 | Include model in log file. Exclude directory. | 2023-01-05 09:14:56 -08:00
timntorres | b85c2b5cf4 | Clean up ti, add same behavior to hypernetwork. | 2023-01-05 08:14:38 -08:00
timntorres | eea8fc40e1 | Add option to save ti settings to file. | 2023-01-05 07:24:22 -08:00
AUTOMATIC1111 | eeb1de4388 | Merge branch 'master' into gradient-clipping | 2023-01-04 19:56:35 +03:00
AUTOMATIC | 525cea9245 | use shared function from processing for creating dummy mask when training inpainting model | 2023-01-04 17:58:07 +03:00
AUTOMATIC | 184e670126 | fix the merge | 2023-01-04 17:45:01 +03:00
AUTOMATIC1111 | da5c1e8a73 | Merge branch 'master' into inpaint_textual_inversion | 2023-01-04 17:40:19 +03:00
AUTOMATIC1111 | 7bbd984dda | Merge pull request #6253 from Shondoit/ti-optim: Save Optimizer next to TI embedding | 2023-01-04 14:09:13 +03:00
Vladimir Mandic | 192ddc04d6 | add job info to modules | 2023-01-03 10:34:51 -05:00
Shondoit | bddebe09ed | Save Optimizer next to TI embedding. Also add check to load only .PT and .BIN files as embeddings (since we add .optim files in the same directory) | 2023-01-03 13:30:24 +01:00
Philpax | c65909ad16 | feat(api): return more data for embeddings | 2023-01-02 12:21:48 +11:00
AUTOMATIC | 311354c0bb | fix the issue with training on SD2.0 | 2023-01-02 00:38:09 +03:00
AUTOMATIC | bdbe09827b | changed embedding accepted shape detection to use existing code and support the new alt-diffusion model, and reformatted messages a bit #6149 | 2022-12-31 22:49:09 +03:00
Vladimir Mandic | f55ac33d44 | validate textual inversion embeddings | 2022-12-31 11:27:02 -05:00
Yuval Aboulafia | 3bf5591efe | fix F541 f-string without any placeholders | 2022-12-24 21:35:29 +02:00
Jim Hays | c0355caefe | Fix various typos | 2022-12-14 21:01:32 -05:00
AUTOMATIC1111 | c9a2cfdf2a | Merge branch 'master' into racecond_fix | 2022-12-03 10:19:51 +03:00
AUTOMATIC1111 | a2feaa95fc | Merge pull request #5194 from brkirch/autocast-and-mps-randn-fixes: Use devices.autocast() and fix MPS randn issues | 2022-12-03 09:58:08 +03:00
PhytoEpidemic | 119a945ef7 | Fix divide by 0 error. Fix of the edge case 0 weight that occasionally will pop up in some specific situations. This was crashing the script. | 2022-12-02 12:16:29 -06:00
brkirch | 4d5f1691dd | Use devices.autocast instead of torch.autocast | 2022-11-30 10:33:42 -05:00
AUTOMATIC1111 | 39827a3998 | Merge pull request #4688 from parasi22/resolve-embedding-name-in-filewords: resolve [name] after resolving [filewords] in training | 2022-11-27 22:46:49 +03:00
AUTOMATIC | b48b7999c8 | Merge remote-tracking branch 'flamelaw/master' | 2022-11-27 12:19:59 +03:00
flamelaw | 755df94b2a | set TI AdamW default weight decay to 0 | 2022-11-27 00:35:44 +09:00