Fampai
e59c66c008
Optimized code for ignoring last CLIP layers
2022-10-09 22:31:23 +03:00
AUTOMATIC
6c383d2e82
show model selection setting on top of page
2022-10-09 22:24:07 +03:00
Artem Zagidulin
9ecea0a8d6
fix missing png info when Extras Batch Process
2022-10-09 18:35:25 +03:00
AUTOMATIC
875ddfeecf
added guard for torch.load to prevent loading pickles with unknown content
2022-10-09 17:58:43 +03:00
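The guard above addresses a known risk: unpickling untrusted checkpoint files can execute arbitrary code. A minimal sketch of the standard mitigation (a restricted unpickler that only resolves safelisted globals, per the Python `pickle` docs); the safelist here is illustrative, not the webui's actual one:

```python
import io
import pickle

# Illustrative safelist: only these (module, name) globals may be resolved.
ALLOWED = {
    ("collections", "OrderedDict"),
}

class RestrictedUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        # Refuse any global not on the safelist instead of importing it.
        if (module, name) in ALLOWED:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(f"global '{module}.{name}' is forbidden")

def restricted_loads(data: bytes):
    """Unpickle bytes while rejecting non-safelisted globals."""
    return RestrictedUnpickler(io.BytesIO(data)).load()

if __name__ == "__main__":
    from collections import OrderedDict
    print(dict(restricted_loads(pickle.dumps(OrderedDict(a=1)))))

    # A classic malicious payload reduces to os.system; loading it fails
    # before the call is ever made.
    class Evil:
        def __reduce__(self):
            import os
            return (os.system, ("echo pwned",))
    try:
        restricted_loads(pickle.dumps(Evil()))
    except pickle.UnpicklingError as e:
        print("blocked:", e)
```

The key point is that `find_class` is consulted at load time, so the dangerous global is rejected before any code runs.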
AUTOMATIC
9d1138e294
fix typo in filename for ESRGAN arch
2022-10-09 15:08:27 +03:00
AUTOMATIC
e6e8cabe0c
change up #2056 to make it work as intended, plus make XY plot write correct values to images
2022-10-09 14:57:48 +03:00
William Moorehouse
594cbfd8fb
Sanitize infotext output (for now)
2022-10-09 14:49:15 +03:00
William Moorehouse
006791c13d
Fix grabbing the model name for infotext
2022-10-09 14:49:15 +03:00
William Moorehouse
d6d10a37bf
Added extended model details to infotext
2022-10-09 14:49:15 +03:00
AUTOMATIC
542a3d3a4a
fix broken hypernetworks in XY plot
2022-10-09 14:33:22 +03:00
AUTOMATIC
77a719648d
fix logic error in #1832
2022-10-09 13:48:04 +03:00
AUTOMATIC
f4578b343d
fix model switching not working properly if there is a different yaml config
2022-10-09 13:23:30 +03:00
AUTOMATIC
bd833409ac
additional changes for saving pnginfo for #1803
2022-10-09 13:10:15 +03:00
Milly
0609ce06c0
Removed duplicate definition of model_path
2022-10-09 12:46:07 +03:00
AUTOMATIC
6f6798ddab
prevent a possible code execution error (thanks, RyotaK)
2022-10-09 12:33:37 +03:00
AUTOMATIC
0241d811d2
Revert "Fix for Prompts_from_file showing extra textbox."
This reverts commit e2930f9821.
2022-10-09 12:04:44 +03:00
AUTOMATIC
ab4fe4f44c
hide filenames for save button by default
2022-10-09 11:59:41 +03:00
Tony Beeman
cbf6dad02d
Handle case where on_show returns the wrong number of arguments
2022-10-09 11:16:38 +03:00
Tony Beeman
86cb16886f
Pull Request Code Review Fixes
2022-10-09 11:16:38 +03:00
Tony Beeman
e2930f9821
Fix for Prompts_from_file showing extra textbox.
2022-10-09 11:16:38 +03:00
Nicolas Noullet
1ffeb42d38
Fix typo
2022-10-09 11:10:13 +03:00
frostydad
ef93acdc73
remove line break
2022-10-09 11:09:17 +03:00
frostydad
03e570886f
Fix incorrect sampler name in output
2022-10-09 11:09:17 +03:00
Fampai
122d42687b
Fix VRAM Issue by only loading in hypernetwork when selected in settings
2022-10-09 11:08:11 +03:00
AUTOMATIC1111
e00b4df7c6
Merge pull request #1752 from Greendayle/dev/deepdanbooru
Added DeepDanbooru interrogator
2022-10-09 10:52:21 +03:00
aoirusann
14192c5b20
Support Download for txt files.
2022-10-09 10:49:11 +03:00
aoirusann
5ab7e88d9b
Add Download & Download as zip
2022-10-09 10:49:11 +03:00
AUTOMATIC
4e569fd888
fixed incorrect message about loading config; thanks anon!
2022-10-09 10:31:47 +03:00
AUTOMATIC
c77c89cc83
make main model loading and model merger use the same code
2022-10-09 10:23:31 +03:00
AUTOMATIC
050a6a798c
support loading .yaml config with same name as model
support EMA weights in processing (????)
2022-10-08 23:26:48 +03:00
Aidan Holland
432782163a
chore: Fix typos
2022-10-08 22:42:30 +03:00
Edouard Leurent
610a7f4e14
Break after finding the local directory of stable diffusion
Otherwise, we may override it with one of the next two paths (. or ..) if it is present there, and then the local paths of other modules (taming transformers, codeformers, etc.) won't be found in sd_path/../.
Fix https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/1085
2022-10-08 22:35:04 +03:00
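The commit body above explains the bug: without breaking out of the search loop, a later candidate path such as `.` or `..` can override the directory that was already found. A minimal sketch of the fixed search loop (the function name and marker directory are hypothetical, not the webui's actual code):

```python
import os

def find_sd_path(candidates, marker="ldm"):
    """Return the first candidate directory containing `marker`.

    Stopping at the first hit is the fix: '.' or '..' may also contain a
    directory named `marker`, and without the break a later match would
    silently override the earlier, correct one.
    """
    found = None
    for path in candidates:
        if os.path.isdir(os.path.join(path, marker)):
            found = path
            break  # stop searching once the directory is found
    return found
```

With the break in place, sibling modules resolved relative to the found path keep working even when the working directory also contains a same-named folder.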
AUTOMATIC
3b2141c5fb
add 'Ignore last layers of CLIP model' option as a parameter to the infotext
2022-10-08 22:21:15 +03:00
AUTOMATIC
e6e42f98df
make --force-enable-xformers work without needing --xformers
2022-10-08 22:12:23 +03:00
Fampai
1371d7608b
Added ability to ignore last n layers in FrozenCLIPEmbedder
2022-10-08 22:10:37 +03:00
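The "ignore last n layers" feature stops the CLIP text encoder a few transformer blocks before the end and uses that intermediate hidden state for conditioning. A framework-free sketch of just the control flow (the stand-in layers below are not the real FrozenCLIPEmbedder):

```python
def encode(tokens, layers, skip_last=0):
    """Apply the layer stack, stopping `skip_last` layers before the end."""
    stop = len(layers) - skip_last
    h = tokens
    for layer in layers[:stop]:
        h = layer(h)
    return h

# Stand-in "layers": each appends its index so the execution path is visible.
layers = [lambda h, i=i: h + [i] for i in range(4)]
print(encode([], layers))               # all 4 layers run
print(encode([], layers, skip_last=2))  # last 2 layers are skipped
```

In the real embedder the same idea is applied to the transformer's hidden states rather than a list of callables.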
DepFA
b458fa48fe
Update ui.py
2022-10-08 20:38:35 +03:00
DepFA
15c4278f1a
TI preprocess wording
I had to check the code to work out what splitting was 🤷🏿
2022-10-08 20:38:35 +03:00
Greendayle
0ec80f0125
Merge branch 'master' into dev/deepdanbooru
2022-10-08 18:28:22 +02:00
AUTOMATIC
3061cdb7b6
add --force-enable-xformers option and also add messages to console regarding cross attention optimizations
2022-10-08 19:22:15 +03:00
AUTOMATIC
f9c5da1592
add fallback for xformers_attnblock_forward
2022-10-08 19:05:19 +03:00
Greendayle
01f8cb4447
made deepdanbooru optional, added to readme, automatic download of deepbooru model
2022-10-08 18:02:56 +02:00
Artem Zagidulin
a5550f0213
alternate prompt
2022-10-08 18:12:19 +03:00
C43H66N12O12S2
cc0258aea7
check for Ampere without destroying the optimizations. again.
2022-10-08 17:54:16 +03:00
C43H66N12O12S2
017b6b8744
check for Ampere
2022-10-08 17:54:16 +03:00
Greendayle
5329d0aba0
Merge branch 'master' into dev/deepdanbooru
2022-10-08 16:30:28 +02:00
AUTOMATIC
cfc33f99d4
why did you do this
2022-10-08 17:29:06 +03:00
Greendayle
2e8ba0fa47
fix conflicts
2022-10-08 16:27:48 +02:00
Milly
4f33289d0f
Fixed typo
2022-10-08 17:15:30 +03:00
AUTOMATIC
27032c47df
restore old opt_split_attention/disable_opt_split_attention logic
2022-10-08 17:10:05 +03:00
AUTOMATIC
dc1117233e
simplify xformers options: --xformers to enable and that's it
2022-10-08 17:02:18 +03:00