Commit graph

1700 commits

Author SHA1 Message Date
AUTOMATIC1111
89237852f4
Merge pull request #5119 from 0xb8/master
Atomically rename saved image to avoid race condition with other processes
2022-12-10 13:26:07 +03:00
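A minimal sketch of the atomic-rename pattern behind #5119: write to a temporary file in the destination directory, then rename it into place in a single step. The helper name save_image_atomic and the PIL-style image argument are assumptions for illustration, not the webui's actual code.

```python
import os
import tempfile

def save_image_atomic(image, path):
    """Write to a temp file in the destination directory, then rename it into
    place in one step so other processes never see a partially written file."""
    dirname = os.path.dirname(path) or "."
    fd, tmp_path = tempfile.mkstemp(dir=dirname, suffix=".tmp")
    try:
        with os.fdopen(fd, "wb") as file:
            image.save(file, format="PNG")  # assumes a PIL-style image object
        os.replace(tmp_path, path)          # atomic rename on POSIX and Windows
    finally:
        if os.path.exists(tmp_path):        # clean up only if something failed
            os.remove(tmp_path)
```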
AUTOMATIC1111
cce306cb67
Merge pull request #5441 from timntorres/add-5433-avoid-sending-size-option
Add option to avoid sending size between interfaces.
2022-12-10 11:07:16 +03:00
AUTOMATIC1111
ec5e072124
Merge pull request #4841 from R-N/vae-fix-none
Fix None option of VAE selector
2022-12-10 09:58:20 +03:00
AUTOMATIC
bab91b1279 add Noise multiplier option to infotext 2022-12-10 09:51:26 +03:00
AUTOMATIC1111
8ee1acc1e4
Merge pull request #5373 from mezotaken/master
add noise strength parameter similar to NAI
2022-12-10 09:36:24 +03:00
AUTOMATIC1111
e5e557fa5d
Merge pull request #5404 from szhublox/merger-ram-usage
Merger ram usage
2022-12-10 09:33:39 +03:00
AUTOMATIC
505ec7e4d9 cleanup some unneeded imports for hijack files 2022-12-10 09:17:39 +03:00
AUTOMATIC
7dbfd8a7d8 do not replace entire unet for the resolution hack 2022-12-10 09:14:45 +03:00
AUTOMATIC1111
2641d1b83b
Merge pull request #4978 from aliencaocao/support_any_resolution
Patch UNet Forward to support resolutions that are not multiples of 64
2022-12-10 08:45:41 +03:00
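A rough sketch of the general idea behind supporting resolutions that are not multiples of 64: pad the input up to the next multiple before the forward pass and crop the output afterwards. This is only an illustration; PR #4978 patches the UNet forward itself and may handle it differently (in latent space the relevant multiple is 8 rather than 64).

```python
import torch.nn.functional as F

def forward_with_padding(forward, x, *args, multiple=64, **kwargs):
    """Pad height/width up to the next multiple of `multiple`, call the
    wrapped forward, then crop the result back to the original size."""
    h, w = x.shape[-2], x.shape[-1]
    pad_h, pad_w = (-h) % multiple, (-w) % multiple
    if pad_h or pad_w:
        x = F.pad(x, (0, pad_w, 0, pad_h), mode="reflect")
    out = forward(x, *args, **kwargs)
    return out[..., :h, :w]
```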
AUTOMATIC1111
4d5fe3bfc0
Merge pull request #5555 from ywx9/master
Bug fix (a few lines in modules/api/api.py)
2022-12-10 08:27:44 +03:00
AUTOMATIC1111
a42a8e9112
Merge pull request #5547 from Ju1-js/master
Make "# settings changed" grammatically correct
2022-12-10 08:20:22 +03:00
ywx9
9539c2045a Bug fix 2022-12-09 23:03:06 +09:00
Ju1-js
ce04ba71b8 Make # settings changed message grammatically correct
Make the ": " in the settings changed message not show if 0 settings were changed.
"0 settings changed: ." -> "0 settings changed."
2022-12-08 22:47:45 -08:00
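The wording fix from #5547, expressed as a hypothetical helper (the real change edits the webui's settings-apply handler): only append the ": ..." list when at least one setting actually changed.

```python
def settings_changed_message(changed):
    """Return '0 settings changed.' for an empty list and
    'N settings changed: a, b.' otherwise."""
    if not changed:
        return "0 settings changed."
    return f"{len(changed)} settings changed: {', '.join(changed)}."
```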
Jay Smith
1ed4f0e228 Depth2img model support 2022-12-08 20:50:08 -06:00
timntorres
7057c72ae3 Add opt. to avoid sending size between interfaces. 2022-12-05 03:41:36 -08:00
Mackerel
681c450ecd extras.py: use as little RAM as possible, misc fixes
maximum of 2 models loaded at once. delete unneeded model before next step. fix 'teritary' -> 'tertiary'. gracefully fail when "add difference" is selected without a tertiary model
2022-12-04 10:31:06 -05:00
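A sketch of the memory discipline described above, with an assumed load() callable standing in for the webui's checkpoint loading: at most two state dicts are held at once, the tertiary model is dropped as soon as the difference is computed, and "add difference" without a tertiary model fails up front.

```python
import gc

def merge_checkpoints(load, primary, secondary, tertiary=None,
                      mode="weighted sum", alpha=0.5):
    """Illustrative memory-conscious merge; not the actual extras.py code."""
    if mode == "add difference" and tertiary is None:
        raise ValueError("'add difference' requires a tertiary model")

    theta_1 = load(secondary)
    if mode == "add difference":
        theta_2 = load(tertiary)                  # two state dicts in memory
        for key in theta_1:
            if key in theta_2:
                theta_1[key] = theta_1[key] - theta_2[key]
        del theta_2                               # drop before loading primary
        gc.collect()

    theta_0 = load(primary)                       # still only two in memory
    for key in theta_0:
        if key in theta_1:
            theta_0[key] = (1 - alpha) * theta_0[key] + alpha * theta_1[key]
    del theta_1
    gc.collect()
    return theta_0
```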
AUTOMATIC
44c46f0ed3 make it possible to merge inpainting model with non-inpainting one 2022-12-04 12:30:44 +03:00
AUTOMATIC
8504db5170 fix #4459 breaking inpainting when the option is not specified. 2022-12-04 01:04:24 +03:00
AUTOMATIC
60bd4d52a6 fix incorrect file extension filter for deepdanbooru models 2022-12-03 18:46:09 +03:00
AUTOMATIC
4b0dc206ed use modelloader for #4956 2022-12-03 18:45:51 +03:00
AUTOMATIC1111
2a649154ec
Merge pull request #4956 from TiagoSantos81/offline_BLIP
[CLIP interrogator] use local file, if available
2022-12-03 18:17:56 +03:00
AUTOMATIC
0d21624cee move #5216 to the extension 2022-12-03 18:16:19 +03:00
AUTOMATIC
89e1df013b Merge remote-tracking branch 'wywywywy/autoencoder-hijack' 2022-12-03 18:08:10 +03:00
AUTOMATIC
b6e5edd746 add built-in extension system
add support for adding upscalers in extensions
move LDSR, ScuNET and SwinIR to built-in extensions
2022-12-03 18:06:33 +03:00
Vladimir Repin
cf3e844d1d add noise strength parameter similar to NAI 2022-12-03 18:05:47 +03:00
AUTOMATIC
46b0d230e7 add comment for #4407 and remove seemingly unnecessary cudnn.enabled 2022-12-03 16:01:23 +03:00
AUTOMATIC
2651267e3a fix #4407 breaking UI entirely for cards other than the ones related to the PR 2022-12-03 15:57:52 +03:00
AUTOMATIC1111
681c0003df
Merge pull request #4407 from yoinked-h/patch-1
Fix issue with 16xx cards
2022-12-03 10:30:34 +03:00
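For context, #4407 worked around GTX 16xx cards producing broken output by toggling cuDNN flags; the follow-up commits above scope the change so other cards are unaffected and drop the apparently unnecessary cudnn.enabled line. A hedged sketch of the remaining idea (the device-name check is an assumption, not the PR's exact code):

```python
import torch

if torch.cuda.is_available():
    device_name = torch.cuda.get_device_name()
    if "GTX 16" in device_name:               # assumption: crude 16xx-series check
        torch.backends.cudnn.benchmark = True
```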
AUTOMATIC1111
d2e5b4edfa
Merge pull request #5251 from adieyal/bug/negative-prompt-infotext
Fixed incorrect negative prompt text in infotext
2022-12-03 10:21:43 +03:00
AUTOMATIC1111
c9a2cfdf2a
Merge branch 'master' into racecond_fix 2022-12-03 10:19:51 +03:00
AUTOMATIC1111
5cd5a672f7
Merge pull request #4459 from kavorite/color-sketch-inpainting
add `--gradio-inpaint-tool` and option to specify `color-sketch`
2022-12-03 10:06:27 +03:00
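The flag added by #4459, roughly as it would be declared with argparse; the choices come from the PR title, while the default shown here is an assumption.

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument(
    "--gradio-inpaint-tool", type=str,
    choices=["sketch", "color-sketch"], default="sketch",
    help="gradio image tool to use for inpainting",
)
```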
AUTOMATIC1111
a2feaa95fc
Merge pull request #5194 from brkirch/autocast-and-mps-randn-fixes
Use devices.autocast() and fix MPS randn issues
2022-12-03 09:58:08 +03:00
AUTOMATIC
c7af672186 simpler config option name plus mouseover hint for clip skip 2022-12-03 09:41:39 +03:00
AUTOMATIC1111
c67d8bca4f
Merge pull request #5304 from space-nuko/fix/clip-skip-application
Fix clip skip of 1 not being restored from prompts
2022-12-03 09:37:10 +03:00
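The gist of #5304 as a simplified parameter-restore snippet: when pasting generation parameters, a clip skip of 1 must be applied as the fallback rather than leaving the previously selected value in place. Names here are illustrative; CLIP_stop_at_last_layers is the webui's clip-skip setting.

```python
def restore_clip_skip(params, opts):
    """Fall back to 1 when 'Clip skip' is absent from the pasted parameters
    instead of keeping whatever value was set before."""
    opts.CLIP_stop_at_last_layers = int(params.get("Clip skip", 1))
```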
AUTOMATIC1111
28c79b8f05
Merge pull request #5328 from jcowens/fix-typo
fix typo
2022-12-03 09:20:39 +03:00
AUTOMATIC1111
eb0b8f92bc
Merge pull request #5331 from smirkingface/openaimodel_fix
Fixed AttributeError where openaimodel is not found
2022-12-03 09:18:36 +03:00
AUTOMATIC1111
bab6ea6b22
Merge pull request #5340 from PhytoEpidemic/master
Fix divide by 0 error
2022-12-03 09:17:54 +03:00
AUTOMATIC
b2f17dd367 prevent include_init_images from being passed to StableDiffusionProcessingImg2Img in API #4989 2022-12-03 09:15:24 +03:00
AUTOMATIC1111
ae81b377d4
Merge pull request #5165 from klimaleksus/fix-sequential-vae
Make VAE step sequential to prevent VRAM spikes, will fix #3059, #2082, #2561, #3462
2022-12-03 08:29:56 +03:00
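The VRAM-spike fix in #5165 boils down to decoding latents one sample at a time instead of as a whole batch. A minimal sketch, assuming a vae.decode callable that returns tensors (the webui goes through the model's first-stage decode):

```python
import torch

def decode_sequentially(vae, latents):
    """Decode latents one sample at a time; peak VRAM scales with a single
    image rather than the whole batch, at a small speed cost."""
    with torch.no_grad():
        images = [vae.decode(latents[i:i + 1]) for i in range(latents.shape[0])]
    return torch.cat(images, dim=0)
```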
PhytoEpidemic
119a945ef7
Fix divide by 0 error
Fixes an edge case where a weight of 0 occasionally pops up in some specific situations and was crashing the script.
2022-12-02 12:16:29 -06:00
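The commit does not show the script's arithmetic, so the following is only a generic illustration of that kind of guard: skip the division when the weight total is zero instead of crashing.

```python
def normalize_weights(weights, eps=1e-12):
    """Scale weights to sum to 1, returning all zeros when the total is 0
    instead of raising ZeroDivisionError on the 0-weight edge case."""
    total = sum(weights)
    if abs(total) < eps:
        return [0.0] * len(weights)
    return [w / total for w in weights]
```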
SmirkingFace
da698ca92e Fixed AttributeError where openaimodel is not found 2022-12-02 13:47:02 +01:00
jcowens
99b19b1a8f fix typo 2022-12-02 02:53:26 -08:00
SmirkingFace
e461477869 Fixed safe.py for pytorch 1.13 ckpt files 2022-12-02 11:12:13 +01:00
space-nuko
be2e6de94a Fix clip skip of 1 not being restored from prompts 2022-12-01 11:34:16 -08:00
brkirch
0fddb4a1c0 Rework MPS randn fix, add randn_like fix
torch.manual_seed() already sets a CPU generator, so there is no reason to create a CPU generator manually. torch.randn_like also needs an MPS fix for k-diffusion, but a torch hijack with randn_like already exists, so it can also be used for that.
2022-11-30 10:33:42 -05:00
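A condensed sketch of the approach the commit message describes (function names are illustrative; the webui applies this through its existing torch hijack): draw noise with the CPU generator, which torch.manual_seed() already seeds, and move it to the target device, with a randn_like counterpart so k-diffusion is covered.

```python
import torch

def randn(shape, device):
    """Draw noise with the CPU generator (already seeded by torch.manual_seed)
    and move it to the target device, sidestepping MPS randn issues."""
    return torch.randn(shape, device="cpu").to(device)

def randn_like(x):
    """randn_like counterpart, needed so k-diffusion samplers are covered too."""
    return torch.randn(x.shape, dtype=x.dtype, device="cpu").to(x.device)
```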
brkirch
4d5f1691dd Use devices.autocast instead of torch.autocast 2022-11-30 10:33:42 -05:00
brkirch
21effd629d Add workaround for using MPS with torchsde 2022-11-30 10:33:39 -05:00
Adi Eyal
a44994e2c9 Fixed incorrect negative prompt text in infotext
Previously only the first negative prompt in all_negative_prompts was being used for infotext. This fixes that by selecting the index-th negative prompt.
2022-11-30 15:23:53 +02:00
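In code terms, the fix replaces an unconditional all_negative_prompts[0] with the prompt at the current index; a simplified sketch with a hypothetical helper name:

```python
def negative_prompt_for_infotext(all_negative_prompts, index):
    """Pick the index-th negative prompt for infotext instead of always
    the first one in the batch."""
    if not all_negative_prompts:
        return ""
    return all_negative_prompts[min(index, len(all_negative_prompts) - 1)]
```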
Billy Cao
3a724e91a2 Change to steps of 8 2022-11-30 20:52:32 +08:00
wywywywy
7193814cf7
Added purpose of this hijack to comments 2022-11-29 19:22:53 +00:00