Commit graph

1174 commits

Author SHA1 Message Date
DepFA
15c4278f1a TI preprocess wording 2022-10-08 20:38:35 +03:00
    I had to check the code to work out what splitting was 🤷🏿
Greendayle
0ec80f0125 Merge branch 'master' into dev/deepdanbooru 2022-10-08 18:28:22 +02:00
AUTOMATIC
3061cdb7b6 add --force-enable-xformers option and also add messages to console regarding cross attention optimizations 2022-10-08 19:22:15 +03:00
AUTOMATIC
f9c5da1592 add fallback for xformers_attnblock_forward 2022-10-08 19:05:19 +03:00
Greendayle
01f8cb4447 made deepdanbooru optional, added to readme, automatic download of deepbooru model 2022-10-08 18:02:56 +02:00
Artem Zagidulin
a5550f0213 alternate prompt 2022-10-08 18:12:19 +03:00
DepFA
34acad1628 Add GZipMiddleware to root demo 2022-10-08 18:03:16 +03:00
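The GZipMiddleware commit above makes the web server compress HTTP responses. As a minimal stdlib illustration of the payoff (this is not the webui code, and the payload shape is made up), gzip shrinks a repetitive JSON response to a small fraction of its size:

```python
import gzip
import json

# Hypothetical, repetitive JSON payload of the kind a web UI returns.
payload = json.dumps({"images": ["base64data" * 100], "info": "ok"}).encode("utf-8")
compressed = gzip.compress(payload)

# The compressed body is much smaller than the original.
print(f"{len(payload)} -> {len(compressed)} bytes")
```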
C43H66N12O12S2
cc0258aea7 check for ampere without destroying the optimizations. again. 2022-10-08 17:54:16 +03:00
C43H66N12O12S2
017b6b8744 check for ampere 2022-10-08 17:54:16 +03:00
C43H66N12O12S2
7e639cd498 check for 3.10 2022-10-08 17:54:16 +03:00
Greendayle
5329d0aba0 Merge branch 'master' into dev/deepdanbooru 2022-10-08 16:30:28 +02:00
AUTOMATIC
cfc33f99d4 why did you do this 2022-10-08 17:29:06 +03:00
Greendayle
2e8ba0fa47 fix conflicts 2022-10-08 16:27:48 +02:00
Milly
4f33289d0f Fixed typo 2022-10-08 17:15:30 +03:00
AUTOMATIC
27032c47df restore old opt_split_attention/disable_opt_split_attention logic 2022-10-08 17:10:05 +03:00
AUTOMATIC
dc1117233e simplify xformers options: --xformers to enable and that's it 2022-10-08 17:02:18 +03:00
AUTOMATIC
7ff1170a2e emergency fix for xformers (continue + shared) 2022-10-08 16:33:39 +03:00
AUTOMATIC1111
48feae37ff Merge pull request #1851 from C43H66N12O12S2/flash (xformers attention) 2022-10-08 16:29:59 +03:00
C43H66N12O12S2
970de9ee68 Update sd_hijack.py 2022-10-08 16:29:43 +03:00
C43H66N12O12S2
7ffea15078 Update requirements_versions.txt 2022-10-08 16:24:06 +03:00
C43H66N12O12S2
ca5f0f149c Update launch.py 2022-10-08 16:22:38 +03:00
C43H66N12O12S2
69d0053583 update sd_hijack_opt to respect new env variables 2022-10-08 16:21:40 +03:00
C43H66N12O12S2
ddfa9a9786 add xformers_available shared variable 2022-10-08 16:20:41 +03:00
C43H66N12O12S2
26b459a379 default to split attention if cuda is available and xformers is not 2022-10-08 16:20:04 +03:00
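Taken together, the commits above converge on a fallback order for cross-attention: use xformers when it is installed and enabled, otherwise use split attention on CUDA, otherwise the unoptimized default. A minimal sketch of that selection logic (function and flag names are hypothetical, not the actual sd_hijack code):

```python
def pick_attention_optimization(xformers_available: bool, cuda_available: bool) -> str:
    """Choose a cross-attention implementation by the fallback order above.

    Hypothetical sketch: the real code wires these decisions to command-line
    options and a shared module rather than plain booleans.
    """
    if xformers_available:
        return "xformers"
    if cuda_available:
        return "split-attention"
    return "default"
```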
C43H66N12O12S2
d0e85873ac check for OS and env variable 2022-10-08 16:13:26 +03:00
MrCheeze
5f85a74b00 fix bug where when using prompt composition, hijack_comments generated before the final AND will be dropped 2022-10-08 15:48:04 +03:00
guaneec
32e428ff19 Remove duplicate event listeners 2022-10-08 15:47:24 +03:00
ddPn08
772db721a5 fix glob path in hypernetwork.py 2022-10-08 15:46:54 +03:00
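Glob-path bugs like the one fixed above usually come from concatenating path fragments by hand. A small, self-contained sketch (not the hypernetwork.py code) of building the pattern with os.path.join so the separator is correct on every OS:

```python
import glob
import os
import tempfile

with tempfile.TemporaryDirectory() as root:
    # Create a couple of model files plus one that should not match.
    for name in ("a.pt", "b.pt", "notes.txt"):
        open(os.path.join(root, name), "w").close()

    # os.path.join keeps the pattern portable; hand-written "/" or "\\"
    # separators are the usual cause of globs that silently match nothing.
    models = sorted(glob.glob(os.path.join(root, "*.pt")))

print([os.path.basename(p) for p in models])  # -> ['a.pt', 'b.pt']
```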
AUTOMATIC
7001bffe02 fix AND broken for long prompts 2022-10-08 15:43:25 +03:00
AUTOMATIC
77f4237d1c fix bugs related to variable prompt lengths 2022-10-08 15:25:59 +03:00
C43H66N12O12S2
3f166be1b6 Update requirements.txt 2022-10-08 14:42:50 +03:00
C43H66N12O12S2
4201fd14f5 install xformers 2022-10-08 14:42:34 +03:00
AUTOMATIC
4999eb2ef9 do not let user choose his own prompt token count limit 2022-10-08 14:25:47 +03:00
Trung Ngo
00117a07ef check specifically for skipped 2022-10-08 13:40:39 +03:00
Trung Ngo
786d9f63aa Add button to skip the current iteration 2022-10-08 13:40:39 +03:00
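A skip button like the one added above typically works by cooperative polling: the UI sets a flag on a shared state object, and the generation loop checks it each iteration. A minimal sketch of that pattern (class and method names are made up, not the actual shared.state API):

```python
class State:
    """Stand-in for a shared UI state object (hypothetical)."""

    def __init__(self) -> None:
        self.skipped = False

    def skip(self) -> None:
        self.skipped = True


def run_batch(state: State, jobs: list) -> list:
    results = []
    for job in jobs:
        if state.skipped:
            state.skipped = False  # consume the flag: skip only this iteration
            continue
        results.append(job * 2)  # placeholder for the real per-image work
    return results
```

Because the flag is consumed where it is observed, pressing the button drops only the current job; the rest of the batch continues.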
AUTOMATIC
45cc0ce3c4 Merge remote-tracking branch 'origin/master' 2022-10-08 13:39:08 +03:00
AUTOMATIC
706d5944a0 let user choose his own prompt token count limit 2022-10-08 13:38:57 +03:00
leko
616b7218f7 fix: handles when state_dict does not exist 2022-10-08 12:38:50 +03:00
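The fix above guards checkpoint loading against files that store weights at the top level instead of under a "state_dict" key. A one-line sketch of that guard (hypothetical function name; the real code operates on a loaded torch checkpoint):

```python
def get_state_dict(checkpoint: dict) -> dict:
    # Some checkpoints wrap weights under "state_dict"; others are flat.
    # Fall back to the whole dict when the key is absent.
    return checkpoint.get("state_dict", checkpoint)
```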
C43H66N12O12S2
91d66f5520 use new attnblock for xformers path 2022-10-08 11:56:01 +03:00
C43H66N12O12S2
76a616fa6b Update sd_hijack_optimizations.py 2022-10-08 11:55:38 +03:00
C43H66N12O12S2
5d54f35c58 add xformers attnblock and hypernetwork support 2022-10-08 11:55:02 +03:00
AUTOMATIC
87db6f01cc add info about cross attention javascript shortcut code 2022-10-08 10:15:29 +03:00
DepFA
21679435e5 implement removal 2022-10-08 09:43:31 +03:00
DepFA
83749bfc72 context menu styling 2022-10-08 09:43:31 +03:00
DepFA
e21e473253 Context Menus 2022-10-08 09:43:31 +03:00
brkirch
f2055cb1d4 Add hypernetwork support to split cross attention v1 2022-10-08 09:39:17 +03:00
    * Add hypernetwork support to split_cross_attention_forward_v1
    * Fix device check in esrgan_model.py to use devices.device_esrgan instead of shared.device
Jairo Correa
a958f9b3fd edit-attention browser compatibility and readme typo 2022-10-08 09:38:44 +03:00
C43H66N12O12S2
b70eaeb200 delete broken and unnecessary aliases 2022-10-08 04:10:35 +03:00
C43H66N12O12S2
c9cc65b201 switch to the proper way of calling xformers 2022-10-08 04:09:18 +03:00
Greendayle
5f12e7efd9 linux test 2022-10-07 20:58:30 +02:00