Commit graph

1275 commits

Author SHA1 Message Date
AUTOMATIC
61836bd544 shorten Hypernetwork strength in infotext and omit it when it's the default value. 2022-10-30 08:48:53 +03:00
AUTOMATIC1111
470f184176
Merge pull request #3831 from timntorres/3825-save-hypernet-strength-to-info
Save Hypernetwork strength to infotext.
2022-10-30 08:47:18 +03:00
AUTOMATIC
05a657dd35 fix broken hires fix 2022-10-30 07:41:56 +03:00
timntorres
66d038f6a4 Read hypernet strength from PNG info. 2022-10-29 15:00:08 -07:00
timntorres
e709afb0f7 Merge commit 'e7254746bbfbff45099db44a8d4d25dd6181877d' into 3825-save-hypernet-strength-to-info 2022-10-29 14:55:30 -07:00
AUTOMATIC1111
c328deb5f1
Merge pull request #3934 from bamarillo/api-add-png-info-endpoint
[API][Feature] Add png info endpoint
2022-10-29 22:20:50 +03:00
AUTOMATIC
9bb6b6509a add postprocess call for scripts 2022-10-29 22:20:02 +03:00
Bruno Seoane
83a1f44ae2 Fix space 2022-10-29 16:10:00 -03:00
Bruno Seoane
4609b83cd4 Add PNG Info endpoint 2022-10-29 16:09:19 -03:00
AUTOMATIC
35c45df28b fix broken ↙ button, fix field paste ignoring most of the useful fields for #3768 2022-10-29 10:56:19 +03:00
timntorres
2c4d203884 Revert "Explicitly state when Hypernet is none." 2022-10-29 00:36:51 -07:00
timntorres
e98f72be33
Merge branch 'AUTOMATIC1111:master' into 3825-save-hypernet-strength-to-info 2022-10-29 00:31:23 -07:00
AUTOMATIC
beb6fc2979 move send seed option to UI section and make it false by default 2022-10-29 09:57:22 +03:00
AUTOMATIC1111
9553a7e071
Merge pull request #3818 from jwatzman/master
Reduce peak memory usage when changing models
2022-10-29 09:16:00 +03:00
AUTOMATIC
28e6d4a54e add element ids for save buttons for #3798 2022-10-29 09:13:36 +03:00
AUTOMATIC1111
76086f6668
Merge branch 'master' into modal-save-button-and-shortcut 2022-10-29 09:11:00 +03:00
AUTOMATIC1111
f3454b8a6b
Merge pull request #3691 from xmodar/arabic
Revamped Arabic localization
2022-10-29 09:03:35 +03:00
AUTOMATIC
2922d8144f make existing image browser extension not break 2022-10-29 09:01:04 +03:00
AUTOMATIC
af547f63c3 Merge branch 'Inspiron' 2022-10-29 08:48:11 +03:00
AUTOMATIC
3c207ca684 add needed imports for new code in copypaste.py 2022-10-29 08:42:34 +03:00
AUTOMATIC
a33d0a9a65 remove weird spaces added to ui.py over time 2022-10-29 08:28:48 +03:00
AUTOMATIC
2d220afb24 fix open folder button not working 2022-10-29 08:26:12 +03:00
AUTOMATIC
a1e5e0d766 skip filenames starting with . for img2img and extras batch modes 2022-10-29 08:11:03 +03:00
AUTOMATIC1111
cf8da8e1b0
Merge pull request #3826 from ANTONIOPSD/patch-1
Natural sorting for dropdown checkpoint list
2022-10-29 08:02:03 +03:00
AUTOMATIC1111
810e6a407d
Merge pull request #3858 from R-N/log-csv
Fix log off by 1 #3847
2022-10-29 07:55:20 +03:00
AUTOMATIC1111
3019452927
Merge pull request #3803 from FlameLaw/master
Fix dataset shuffling
2022-10-29 07:52:51 +03:00
AUTOMATIC1111
86e19fe873
Merge pull request #3669 from random-thoughtss/master
Added option to use unmasked conditioning image for inpainting model.
2022-10-29 07:49:48 +03:00
AUTOMATIC1111
1fba573d24
Merge pull request #3874 from cobryan05/extra_tweak
Extras Tab - Option to upscale before face fix, caching improvements
2022-10-29 07:44:17 +03:00
AUTOMATIC
bce5adcd6d change default hypernet activation function to linear 2022-10-29 07:37:06 +03:00
AUTOMATIC1111
f3685281e2
Merge pull request #3877 from Yaiol/master
Filename tags wrongly reference process size instead of image size
2022-10-29 07:32:11 +03:00
AUTOMATIC1111
d3b4b9d7ec
Merge pull request #3717 from benkyoujouzu/master
Add missing support for linear activation in hypernetwork
2022-10-29 07:30:14 +03:00
AUTOMATIC1111
fc89495df3
Merge pull request #3771 from aria1th/patch-12
Disable unavailable or duplicate options for Activation functions
2022-10-29 07:29:02 +03:00
Bruno Seoane
0edf100d83
Merge branch 'AUTOMATIC1111:master' into master 2022-10-28 22:03:49 -03:00
AngelBottomless
f361e804eb
Re-enable linear 2022-10-29 08:36:50 +09:00
Yaiol
539c0f51e4 Update images.py
Filename tags [height] and [width] wrongly reference the process size instead of the resulting image size, so all upscaled files are named incorrectly.
2022-10-29 01:07:01 +02:00
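As an illustration of the fix described in the entry above, here is a minimal sketch of filename-tag substitution that reads the dimensions from the resulting image rather than from the processing settings. The function name and tag-handling below are illustrative, not the actual code in images.py:

```python
def apply_dimension_tags(pattern: str, image) -> str:
    # Substitute [width]/[height] from the final image (e.g. a PIL.Image),
    # not from the width/height the processing was requested at.
    return (pattern
            .replace("[width]", str(image.width))
            .replace("[height]", str(image.height)))
```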
Chris OBryan
d8b3661467 extras: upscaler blending should not be considered in cache key 2022-10-28 16:55:02 -05:00
Chris OBryan
5732c0282d extras-tweaks: autoformat changed lines 2022-10-28 16:36:25 -05:00
Chris OBryan
1f1b327959 extras: Make image cache LRU
This changes the extras image cache into a least-recently-used (LRU)
cache, which allows more experimentation with different upscalers
without missing the cache.

The maximum cache size is increased to 5, and the cache is cleared
when the source image is updated.
2022-10-28 16:14:21 -05:00
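A minimal sketch of the LRU behavior this commit describes, built on an OrderedDict; the class and method names are illustrative and not the actual code in the extras module:

```python
from collections import OrderedDict

class UpscaleLRUCache:
    """Illustrative LRU cache: keeps up to max_size upscaled results and
    evicts the least-recently-used entry once the limit is exceeded."""

    def __init__(self, max_size=5):
        self.max_size = max_size
        self._entries = OrderedDict()

    def get(self, key):
        if key not in self._entries:
            return None
        self._entries.move_to_end(key)          # mark as most recently used
        return self._entries[key]

    def put(self, key, image):
        self._entries[key] = image
        self._entries.move_to_end(key)
        if len(self._entries) > self.max_size:
            self._entries.popitem(last=False)   # drop least recently used

    def clear(self):
        # Called when the source image changes: all cached results are stale.
        self._entries.clear()
```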
Chris OBryan
bde4731f1d extras: Rework image cache
A bit of a refactor of the image cache to make it easier to extend.
The cache now also takes into account the entire image instead of just a cropped portion.
2022-10-28 14:44:25 -05:00
Chris OBryan
26d0819384 extras: Add option to run upscaling before face fixing
Face restoration can look much better if run after upscaling, as it
allows the restoration to fix upscaling artifacts. This patch adds
an option to choose which order to run upscaling and face fixing in.
2022-10-28 13:33:49 -05:00
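A hedged sketch of the ordering option described in this commit; `upscale`, `restore_faces`, and `upscale_first` are stand-ins for the real processing functions and UI option, not the actual implementation:

```python
def postprocess(image, upscale, restore_faces, upscale_first: bool):
    # Run the two steps in whichever order the user selected. Upscaling first
    # lets face restoration clean up upscaling artifacts.
    steps = [upscale, restore_faces] if upscale_first else [restore_faces, upscale]
    for step in steps:
        image = step(image)
    return image
```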
Muhammad Rizqi Nur
9ceef81f77 Fix log off by 1 2022-10-28 20:48:08 +07:00
timntorres
db5a354c48 Always ignore "None.pt" in the hypernet directory. 2022-10-28 01:41:57 -07:00
timntorres
c0677b3316 Explicitly state when Hypernet is none. 2022-10-27 23:31:45 -07:00
timntorres
d4a069a23c Read hypernet strength from PNG info. 2022-10-27 23:16:27 -07:00
timntorres
9e465c8aa5 Add strength to textinfo. 2022-10-27 23:03:34 -07:00
benkyoujouzu
b2a8b263b2 Add missing support for linear activation in hypernetwork 2022-10-28 12:54:59 +08:00
Antonio
5d5dc64064
Natural sorting for dropdown checkpoint list
Example:

Before                        After

11.ckpt                       11.ckpt
ab.ckpt                       ab.ckpt
ade_pablo_step_1000.ckpt      ade_pablo_step_500.ckpt
ade_pablo_step_500.ckpt       ade_pablo_step_1000.ckpt
ade_step_1000.ckpt            ade_step_500.ckpt
ade_step_1500.ckpt            ade_step_1000.ckpt
ade_step_2000.ckpt            ade_step_1500.ckpt
ade_step_2500.ckpt            ade_step_2000.ckpt
ade_step_3000.ckpt            ade_step_2500.ckpt
ade_step_500.ckpt             ade_step_3000.ckpt
atp_step_5500.ckpt            atp_step_5500.ckpt
model1.ckpt                   model1.ckpt
model10.ckpt                  model10.ckpt
model1000.ckpt                model33.ckpt
model33.ckpt                  model50.ckpt
model400.ckpt                 model400.ckpt
model50.ckpt                  model1000.ckpt
moo44.ckpt                    moo44.ckpt
v1-4-pruned-emaonly.ckpt      v1-4-pruned-emaonly.ckpt
v1-5-pruned-emaonly.ckpt      v1-5-pruned-emaonly.ckpt
v1-5-pruned.ckpt              v1-5-pruned.ckpt
v1-5-vae.ckpt                 v1-5-vae.ckpt
2022-10-28 05:49:39 +02:00
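For reference, natural sorting of this kind is typically done by splitting names into text and number chunks so that numeric parts compare by value. The sketch below is a generic example of the technique, not the exact code added in this commit:

```python
import re

def natural_sort_key(name: str):
    # "ade_step_500.ckpt" -> ["ade_step_", 500, ".ckpt"], so 500 sorts before 1000.
    return [int(part) if part.isdigit() else part.lower()
            for part in re.split(r"(\d+)", name)]

checkpoints = ["ade_step_1000.ckpt", "ade_step_500.ckpt", "model10.ckpt", "model1.ckpt"]
print(sorted(checkpoints, key=natural_sort_key))
# ['ade_step_500.ckpt', 'ade_step_1000.ckpt', 'model1.ckpt', 'model10.ckpt']
```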
Josh Watzman
b50ff4f4e4 Reduce peak memory usage when changing models
A few tweaks to reduce peak memory usage, the biggest being that if we
aren't using the checkpoint cache, we shouldn't duplicate the model
state dict just to immediately throw it away.

On my machine with 16GB of RAM, this change means I can typically change
models, whereas before it would typically OOM.
2022-10-27 22:01:06 +01:00
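A minimal sketch of the idea in this commit, under the assumption that the checkpoint cache keeps a copy of the state dict; the names below are illustrative, not the actual code in the model-loading module. The point is to duplicate the freshly loaded state dict only when it must also be kept in the cache, so at peak there is a single copy in RAM:

```python
import copy

def state_dict_for_load(checkpoint_key, state_dict, checkpoint_cache: dict, cache_enabled: bool):
    # Keep a pristine copy only if the checkpoint cache is in use; otherwise
    # hand the loaded dict straight to the model and let it be freed afterwards.
    if cache_enabled:
        checkpoint_cache[checkpoint_key] = copy.deepcopy(state_dict)
    return state_dict
```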
random_thoughtss
b68c7c437e Updated name and hover text. 2022-10-27 11:45:35 -07:00
random_thoughtss
a38496c1de Moved mask weight config to SD section 2022-10-27 11:31:31 -07:00