Commit graph

757 commits

Author SHA1 Message Date
brkirch
57e03cdd24 Ensure the directory exists before saving to it
The directory for the images saved with the Save button may still not exist, so it needs to be created prior to opening the log.csv file.
2022-10-12 09:55:56 +03:00
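A minimal sketch of the fix this commit describes, assuming a hypothetical save handler that appends a row to log.csv; the function name and arguments below are illustrative, not the repository's actual code:

```python
import os

def save_log_entry(log_dir, filename, prompt):
    # Create the output directory if it does not exist yet, so that
    # opening log.csv below cannot fail with FileNotFoundError.
    os.makedirs(log_dir, exist_ok=True)
    with open(os.path.join(log_dir, "log.csv"), "a", encoding="utf8", newline="") as f:
        f.write(f"{filename},{prompt}\n")
```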
AUTOMATIC
336bd8703c just add the deepdanbooru settings unconditionally 2022-10-12 09:00:07 +03:00
AUTOMATIC
ee10c41e2a Merge remote-tracking branch 'origin/steve3d' 2022-10-12 08:35:52 +03:00
AUTOMATIC1111
2e2d45b281
Merge pull request #2143 from JC-Array/deepdanbooru_pre_process
deepbooru tags for textual inversion preprocessing
2022-10-12 08:35:27 +03:00
Greg Fuller
fec2221eea Truncate error text to fix service lockup / stall
What:
* Update wrap_gradio_call to add a limit to the maximum amount of text output

Why:
* wrap_gradio_call currently prints out a list of the arguments provided to the failing function.
  * If that function is save_image, this causes the entire image to be printed to stderr.
  * If the image is large, this can cause the service to lock up while attempting to print all the text.
* It is easy to generate large images using the x/y plot script.
* It is easy to encounter image save exceptions, including if the output directory does not exist / cannot be written to, or if the file is too big.
* The huge amount of log spam is confusing and not particularly helpful.
2022-10-12 08:30:06 +03:00
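A hedged sketch of the truncation described above, assuming a generic Gradio-style error wrapper; max_debug_str_len and the wrapper shape are illustrative rather than the exact upstream implementation:

```python
import sys

def wrap_gradio_call(func, max_debug_str_len=131072):
    # On failure, dump the call arguments for debugging, but cap the output
    # so a large image passed to save_image cannot stall the service.
    def wrapped(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception:
            arg_str = f"Arguments: {args} {kwargs}"
            if len(arg_str) > max_debug_str_len:
                arg_str = arg_str[:max_debug_str_len] + f"... (truncated, full length: {len(arg_str)})"
            print(arg_str, file=sys.stderr)
            raise
    return wrapped
```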
AUTOMATIC
6ac2ec2b78 create dir for hypernetworks 2022-10-12 07:01:20 +03:00
supersteve3d
65b973ac4e
Update shared.py
Correct typo to "Unload VAE and CLIP from VRAM when training" in settings tab.
2022-10-12 08:21:52 +08:00
JC_Array
f53f703aeb resolved conflicts, moved settings under interrogate section, settings only show if deepbooru flag is enabled 2022-10-11 18:12:12 -05:00
JC-Array
963d986396
Merge branch 'AUTOMATIC1111:master' into deepdanbooru_pre_process 2022-10-11 17:33:15 -05:00
AUTOMATIC
6be32b31d1 reports that training with medvram is possible. 2022-10-11 23:07:09 +03:00
AUTOMATIC
d6fcc6b87b apply lr schedule to hypernets 2022-10-11 22:03:05 +03:00
AUTOMATIC1111
419e539fe3
Merge branch 'learning_rate-scheduling' into learnschedule 2022-10-11 21:50:19 +03:00
AUTOMATIC
6a9ea5b41c prevent extra modules from being saved/loaded with hypernet 2022-10-11 19:22:30 +03:00
AUTOMATIC
d4ea5f4d86 add an option to unload models during hypernetwork training to save VRAM 2022-10-11 19:03:08 +03:00
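A minimal illustration of the idea in this commit, assuming the idle sub-models (e.g. VAE, CLIP) can simply be moved to the CPU while the hypernetwork trains; the helper below is hypothetical:

```python
import torch

def unload_to_cpu(*models):
    # Move sub-models that are idle during hypernetwork training off the GPU
    # and release the freed memory back to the CUDA allocator.
    for model in models:
        model.to(torch.device("cpu"))
    if torch.cuda.is_available():
        torch.cuda.empty_cache()
```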
AUTOMATIC
6d09b8d1df produce error when training with medvram/lowvram enabled 2022-10-11 18:33:57 +03:00
JC_Array
ff4ef13dd5 removed unneeded print 2022-10-11 10:24:27 -05:00
AUTOMATIC
d682444ecc add option to select hypernetwork modules when creating 2022-10-11 18:04:47 +03:00
AUTOMATIC1111
4f96ffd0b5
Merge pull request #2201 from alg-wiki/textual__inversion
Textual Inversion: Preprocess and Training will only pick up image files instead
2022-10-11 17:25:36 +03:00
brkirch
861db783c7 Use apply_hypernetwork function 2022-10-11 17:24:00 +03:00
brkirch
574c8e554a Add InvokeAI and lstein to credits, add back CUDA support 2022-10-11 17:24:00 +03:00
brkirch
98fd5cde72 Add check for psutil 2022-10-11 17:24:00 +03:00
brkirch
c0484f1b98 Add cross-attention optimization from InvokeAI
* Add cross-attention optimization from InvokeAI (~30% speed improvement on MPS)
* Add command line option for it
* Make it default when CUDA is unavailable
2022-10-11 17:24:00 +03:00
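A sketch of the fallback logic this commit describes, under the assumption that the optimization is selected via a boolean command-line flag; the flag and function names here are illustrative:

```python
import torch

def choose_cross_attention(opt_invokeai_attention=False):
    # Prefer the InvokeAI-style optimization when explicitly requested, or
    # when CUDA is unavailable (e.g. Apple MPS), where it gives ~30% speedup.
    if opt_invokeai_attention or not torch.cuda.is_available():
        return "invokeai"
    return "default"
```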
AUTOMATIC1111
f7e86aa420
Merge pull request #2227 from papuSpartan/master
Refresh list of models/ckpts upon hitting restart gradio in the settings pane
2022-10-11 17:15:19 +03:00
AUTOMATIC
66b7d7584f become even stricter with pickles
no pickle shall pass
thank you again, RyotaK
2022-10-11 17:03:16 +03:00
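A minimal sketch of the stricter checkpoint loading this commit refers to, using Python's documented pickle.Unpickler.find_class hook; the allow-list below is illustrative, not the repository's actual list:

```python
import pickle

class RestrictedUnpickler(pickle.Unpickler):
    # Only globals on this allow-list may be resolved; anything else found
    # in a checkpoint is rejected instead of being executed.
    ALLOWED = {
        ("collections", "OrderedDict"),
        ("torch._utils", "_rebuild_tensor_v2"),
    }

    def find_class(self, module, name):
        if (module, name) in self.ALLOWED:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(f"global '{module}.{name}' is forbidden")
```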
papuSpartan
d01a2d0156 move list refresh to webui.py and add stdout indicating it's doing so 2022-10-11 08:31:28 -05:00
AUTOMATIC
b0583be088 more renames 2022-10-11 15:54:34 +03:00
AUTOMATIC
873efeed49 rename hypernetwork dir to hypernetworks to prevent clash with an old filename that people who use zip instead of git clone will have 2022-10-11 15:51:30 +03:00
JamnedZ
a004d1a855 Added new line at the end of ngrok.py 2022-10-11 15:38:53 +03:00
JamnedZ
5992564448 Cleaned ngrok integration 2022-10-11 15:38:53 +03:00
Ben
861297cefe add a placeholder 2022-10-11 15:37:04 +03:00
Ben
87b77cad5f Layout fix 2022-10-11 15:37:04 +03:00
Martin Cairns
eacc03b167 Fix typo in comments 2022-10-11 15:36:29 +03:00
Martin Cairns
1eae307607 Remove debug code for checking that first sigma value is same after code cleanup 2022-10-11 15:36:29 +03:00
Martin Cairns
92d7a13885 Handle different parameters for DPM fast & adaptive 2022-10-11 15:36:29 +03:00
AUTOMATIC
530103b586 fixes related to merge 2022-10-11 14:53:02 +03:00
alg-wiki
8bacbca0a1
Removed my local edits to checkpoint image generation 2022-10-11 17:35:09 +09:00
alg-wiki
b2368a3bce
Switched to exception handling 2022-10-11 17:32:46 +09:00
AUTOMATIC
5de806184f Merge branch 'master' into hypernetwork-training 2022-10-11 11:14:36 +03:00
AUTOMATIC
948533950c replace duplicate code with a function 2022-10-11 11:10:17 +03:00
hentailord85ez
5e2627a1a6
Comma backtrack padding (#2192)
2022-10-11 09:55:28 +03:00
Kenneth
8617396c6d Added slider for deepbooru score threshold in settings 2022-10-11 09:43:16 +03:00
Jairo Correa
8b7d3f1bef Make the ctrl+enter shortcut use the generate button on the current tab 2022-10-11 09:32:03 +03:00
papuSpartan
1add3cff84 Refresh list of models/ckpts upon hitting restart gradio in the settings pane 2022-10-10 19:57:43 -05:00
JC_Array
bb932dbf9f added alpha sort and threshold variables to create process method in preprocessing 2022-10-10 18:37:52 -05:00
JC-Array
47f5e216da
Merge branch 'deepdanbooru_pre_process' into master 2022-10-10 18:10:49 -05:00
JC_Array
76ef3d75f6 added deepbooru settings (threshold and sort by alpha or likelihood) 2022-10-10 18:01:49 -05:00
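A small sketch of the two settings added here (score threshold plus alphabetical or likelihood ordering), assuming the interrogator returns a tag-to-score mapping; the function and its parameters are hypothetical:

```python
def get_filtered_tags(tag_scores, threshold=0.5, alpha_sort=False):
    # Drop low-confidence tags, then order the rest either alphabetically
    # or by descending likelihood, as selected in the new settings.
    kept = [(tag, score) for tag, score in tag_scores.items() if score >= threshold]
    kept.sort(key=(lambda t: t[0]) if alpha_sort else (lambda t: -t[1]))
    return [tag for tag, _ in kept]
```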
JC_Array
b980e7188c corrected tag return in get_deepbooru_tags 2022-10-10 16:52:54 -05:00
JC_Array
a1a05ad2d1 import time was missing; added it to deepbooru, fixing an error in get_deepbooru_tags 2022-10-10 16:47:58 -05:00
alg-wiki
907a88b2d0 Added .webp .bmp 2022-10-11 06:35:07 +09:00
Fampai
2536ecbb17 Refactored learning rate code 2022-10-10 17:10:29 -04:00