
"KCPP SD Failed to create context!" after trying to load flux.1-schnell #1217

Open

tororon1231 opened this issue Nov 16, 2024 · 7 comments

@tororon1231

Downloaded the weights for FLUX.1-schnell. After that, this error was shown:


Welcome to KoboldCpp - Version 1.78
For command line arguments, please refer to --help


Auto Selected CUDA Backend...

Attempting to use CPU library.
Initializing dynamic library: koboldcpp_default.dll

ImageGen Init - Load Model: C:\Users\flux1-schnell-q4_k.gguf

Error: KCPP SD Failed to create context!
Load Image Model OK: False

Error: Could not load image model: C:\Users\flux1-schnell-q4_k.gguf

@LostRuins
Owner

LostRuins commented Nov 16, 2024

Did you load the t5xxl, clip-l and vae as well? Those are required for flux.
https://huggingface.co/comfyanonymous/flux_text_encoders/tree/main

Also make sure you load the image model in the right place (NOT the text model)
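For anyone hitting the same error, here is a rough sketch of a launch command that loads all four files together. The flag names (--sdmodel, --sdt5xxl, --sdclipl, --sdvae) and the paths below are assumptions based on recent KoboldCpp builds, not taken from this thread; check koboldcpp.exe --help for the exact option names in your version.

koboldcpp.exe --sdmodel C:\models\flux1-schnell-q4_k.gguf ^
  --sdt5xxl C:\models\t5xxl_fp8_e4m3fn.safetensors ^
  --sdclipl C:\models\clip_l.safetensors ^
  --sdvae C:\models\ae.safetensors

In the GUI launcher, the same four files go in the image generation section, with the flux checkpoint as the image model and the t5xxl, clip-l and vae files in their own slots, separate from the text model as noted above.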

@tororon1231
Author

tororon1231 commented Nov 16, 2024

[generated image: a cat holding a sign that says "Nice to meet you!"]
@LostRuins Nice, it worked! I used quantized text encoders, so they don't use much space on my machine. Unfortunately, the launch GUI only lets you select .safetensors files, so I had to change the file extension from .gguf to .safetensors. Could you please add support for selecting the .gguf file extension for text encoders in the launch GUI?
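For reference, the extension-rename workaround described above is just a plain rename; on Windows it would look something like this (the encoder filename is only an example, not a file from this thread):

ren t5xxl-q4_k_m.gguf t5xxl-q4_k_m.safetensors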

@LostRuins
Owner

Oh, I didn't know anyone actually made quantized text encoders. Sure, I can do that.

Glad flux is working well for you. Do share and help others as I've noticed many people struggling to get it to work.

@Pyrooogenic

I'm getting this exact same error on KoboldCPP-v1.78.yr0-ROCm. I've tried several of the exact same files this post suggested, and I even tried it on Vulkan through koboldcpp-1.78 to see if the problem was isolated to the ROCm build, but nothing I've done has made this work.


I even made a post about it on the ROCm discussion board. Same problem, but I also noted that KCPP SD in general just stopped working for me entirely on this latest version.

@LostRuins
Owner

Run it with --debugmode and see if any additional information is displayed. Are you sure you've selected all the correct text encoders? Is it only flux that's broken, or is sd1.5 also broken?
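A sketch of such a run, reusing the assumed flag names and placeholder paths from the earlier example:

koboldcpp.exe --debugmode --sdmodel C:\models\flux1-schnell-q4_k.gguf ^
  --sdt5xxl C:\models\t5xxl_fp8_e4m3fn.safetensors ^
  --sdclipl C:\models\clip_l.safetensors ^
  --sdvae C:\models\ae.safetensors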

@Pyrooogenic

> Run it with --debugmode and see if any additional information is displayed. Are you sure you've selected all the correct text encoders? Is it only flux that's broken, or is sd1.5 also broken?

So I went back and retested to make sure I wasn't crazy on the text encoder side of things.


Exact same setup on both ends, just using Vulkan on the normal version and HIP on the ROCm build, of course.

And they both worked.


Something I did not account for was the possibility of a bad model. The vae, t5xxl, and clip_l were all supposed to be from flux.1-schnell, which I used, but apparently that didn't help in the case of the original model I wanted to use.

I went back and tested both the schnell GGUF I tried before and artsyliteQ4KS; the schnell one now works, but artsy doesn't. Originally schnell itself didn't work with the exact same setup, so I don't know what changed.


Same with artsyDream, so I was just led astray by my own incompetence.


![image](https://github.com/user-attachments/assets/1054316a-7369-48c1-8494-792bd2ce768f)

For SD1.5 and SDXL (and by extension the PonyXL model), my usual go-tos, I made sure to retest since you asked.

![Untitled](https://github.com/user-attachments/assets/7f6b3678-ca0a-4bc1-8495-930ba4e6e9ce)

Works too. 
![388777675-02880a9d-b4ef-447c-8208-2e4dfa5d89d6](https://github.com/user-attachments/assets/2c6692bb-1098-4a08-b0cf-65c80eb9ce63)
While earlier I was experiencing this, which I'm unsure how to interpret. I commented here since it really WAS NOT working at all earlier, even with the exact same setup.

So I can't really explain it; it just happened.

@LostRuins
Owner

LostRuins commented Nov 22, 2024

Another possibility is a bad download (you can check SHA256 hashes against the original), or perhaps you were running something else at the time and had insufficient or occupied VRAM. In any case, glad you got it working, and do close this issue if it's resolved.
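As an example of that hash check, on Windows a downloaded file can be hashed with certutil and the result compared against the SHA256 listed on the model's Hugging Face page (the path below is a placeholder):

certutil -hashfile C:\models\flux1-schnell-q4_k.gguf SHA256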
